Friday, May 26, 2023

Micro-services pattern. good and bad.


Some patterns get a lot of press, some don’t. Microservices have enjoyed a good run, so let’s review the good and the bad. Like any architecture challenge, using a pattern is really useful in the correct context, but seeing a pattern as the solution for all problems can be really problematic.


the good. 

Microservices can be taken on by a single team and scaled to handle a specific responsibility of the architecture. Services like Auth (auth and accounts), Notifications, Webhooks or other event handlers are usually good candidates to exist as a service.


the bad. 

When not being pragmatic, a developer will apply a pattern for the sake of it, in the hope of getting the benefits. This has led to extremely complex applications, where the networking and added infrastructure can lead to slow and buggy software.


take-away. 

Be pragmatic; understand the problem you are solving and the pattern for implementing the solution will become apparent.

Focus on the functionality of the architecture; the other system attributes of performance, security, portability and maintainability will let you know when you want to split a feature out into its own service.


A little history


The idea of decoupling functionality into distinct services has been around for about 20 years now. Web Services as a pattern started showing up as the networking and standardization of the protocols started to mature. 


Early networking was done with unique and proprietary protocols and middleware, like CORBA and its IIOP. Once XML became a standard and the speed of the infrastructure caught up, the basic setup of XML over HTTP became the common standard. This has since evolved into the JSON over HTTP we mostly use today. 


Smaller web services


Netflix was growing so fast that aligning teams to functionality and turning each team's deliverable into a distinct service showed a lot of promise. It seemed like a team of five could handle the functionality for a major software feature; these big features could be deployed independently, and they would communicate to form the product.


From this success, the term ‘micro-services’ became a buzzword, and engineering departments felt the need to keep up and adopt this pattern.


Scaling a specific feature


You can always put your monolith app into a Kubernetes setup and scale it up. For any service that fits the API pattern, where it's a stateless processor of requests, this should work fine. 

The trouble can start with other services that have more state and are used differently.


There have been some clear benefits to isolating specific features in the architecture.


Auth (auth and accounts) 

  • this is a high-traffic service that needs high quality. Using a microservice to handle all the details around identity and permissions lets you scale just that piece. I have seen an auth service running in Kubernetes for a very large e-commerce site; during the Black Friday shopping event it had over 4200 nodes running!


Notifications 

  • Sending out notifications, and dealing with the dependencies to do that work, requires a lot of configuration and specific behavior.


Webhook or other event handlers. 

  • This type of service handles a lot of incoming requests and needs specific configuration and infrastructure.



Too many services


When applying micro-services to any problem, there can be a real problem with the associated complexity. This is much the same effect as having many small required libraries in any framework: the whole thing starts to bloat, and in the end you could end up with a monolith of services.


Also, the dependencies between services and the teams developing them will produce a lot of overhead. This can really slow the velocity of a product cycle.


Performance of a service-based architecture can be improved on the networking side with protobuf and other lower-level protocols, but state management is always going to be a complex problem. How big of a challenge do you need this to be?



Let the problem determine an effective solution. 


Patterns emerge from the architecture; applying patterns for the sake of it just adds needless complexity. Focus on functionality that brings value to your product, and as you break down the dependencies and understand your architecture, the need to split a specific piece of that functionality into its own container will become obvious. 



Being an effective developer in a company


Lessons learned when developing software in a company:

Add value

  • Add value. Look to solve problems, not just build stuff. 
  • Be Pragmatic.
  • Be a professional; ego trips and flame wars have no lasting value, professionalism does
  • Learn, there is always a better way.
  • while(developing) { Have a personal process and refine it }
  • keep a journal/log/todo at all times. keep a running queue or stack (depending on how you organize)

Be a good teammate

  • Don't try to keep everything in your head; put designs and thoughts on paper to formalize them.
  • Extremely high rate of return on information sharing. Collaborate.
  • Utilize tools so you don't have to make new ones that do the same thing.

Specs

  • Implement to spec. If no spec exists, document. The document is now the spec.
  • Writing code is part of the solution, after the problem is understood.
  • Write to the API before creating it, make it something you would want to use.

Quality

  • Time to do it right. If the time doesn't exist now, when will it?
  • Are you making good software, or just trying to make some software work? Making a quick patch just pushes the time to do it correctly into the future; fixing it properly makes the problem go away for good.
  • There is always work to do in areas of performance and code clarity/quality. Schedule this as part of the time spent.

Complexity

  • Aggressively attack complexity or it will catch up with you.
  • Don't be too clever for your own good if clever adds complexity. Be clever in simplifying. Complexity will cause brittle code and crusty developers, avoid needless mental work in code.

What math is needed for software development?


We sometimes see articles or comments claiming that you don't need math to be a good programmer.
The thing is, software development has its roots in computer science, and CS is applied mathematics. So it is true that you don't need math to do software development; it just makes it a lot easier if you do. How good do you want your work to be?

In the broader spectrum of software development there is so much to do: planning, designing, reviewing, etc. Do you have the design skills to make it look good? Do you have the empathy to know what the customer wanted in the first place? These don't seem related to math at all, but in pulling them all together to create a solution you will benefit from the problem-solving skills that math provides.

What actual math is needed for programming?

Computer Science is Applied Mathematics, so to be a computer scientist you need a strong mathematical foundation. For making software, you don't have to be a full-blown computer scientist, but you do have to use and understand the logic and the data involved, and how those two things work together to create your program.

One cannot do computer science well without being a good programmer, nor can one understand the underpinnings of computer science without a strong background in mathematics. Education must make students fluent speakers of mathematics and programming, and expose them to both functional and imperative language paradigms so that they can effectively learn computer science.

Early programming courses and discrete mathematics will articulate the strong ties between mathematics and programming. Then the coursework should bridge the gap between the mathematical perspective and the implementation of an algorithm as a sequence of instructions through an imperative language.

I have thought of programming as largely a combination of set theory and predicate logic. Category theory may be a better way of going about the first part, as it is really set theory combined with functions. I'm not sure if that replaces logic so much as extends it. I'm seeing more and more light in the functional approach as the way to glue these concepts together.
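As a small illustration of that view (my own sketch, not from the original notes), Python's built-in sets and the any/all functions map directly onto set operations and logical quantifiers:

    # Sets and set operations (set theory)
    evens = {0, 2, 4, 6, 8}
    small = {0, 1, 2, 3, 4}
    both = evens & small      # intersection -> {0, 2, 4}
    either = evens | small    # union

    # Quantifiers over sets (predicate logic):
    # "for all x in both, x is even" and "there exists an odd x in either"
    all_even = all(x % 2 == 0 for x in both)      # True
    some_odd = any(x % 2 == 1 for x in either)    # True
    print(both, either, all_even, some_odd)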

    How to get there


    The first year programming course should not be viewed as computer science in its entirety. It is a formal language and propositional logic course, which is a foundational aspect of CS, but doesn't represent the entire profession.
    • set theory
    • predicate logic
    • combinatorics
    • probability
    • number theory
    • algorithms
    • algebra
    • graph theory
    • Understand sets and how regular algorithms apply to them
    • Functions: transcendental functions, including trigonometric, logarithmic and exponential functions. Algebraic vectors. Combinatory logic is the root of lambda calculus 
    • Computability and Turing style computer science is a bit at odds with formalism. It's a different philosophy of the nature of mathematics

    Logic: first-order logic (http://en.wikipedia.org/wiki/First-order_theory) and the theory of computation; second-order logic and computational complexity (NP-complete). You also need to understand relations: unary, binary, ternary, n-ary. These are important for iterating over sets with algorithms (check out the STL style of applying functions) and for relational data.

    Numerical methods for solving simultaneous linear equations, roots of equations, eigenvalues and eigenvectors, numerical differentiation and integration, interpolation, solution of ordinary and partial differential equations, and curve fitting.
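    To make a couple of those numerical methods concrete, here is a minimal sketch of my own (the inputs are just examples): Newton's method for finding a root, and the trapezoidal rule for numerical integration.

        import math

        def newton(f, df, x0, tol=1e-10, max_iter=50):
            """Newton's method: follow the tangent line toward a root of f."""
            x = x0
            for _ in range(max_iter):
                step = f(x) / df(x)
                x -= step
                if abs(step) < tol:
                    break
            return x

        def trapezoid(f, a, b, n=1000):
            """Trapezoidal rule: approximate the integral of f over [a, b]."""
            h = (b - a) / n
            total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
            return total * h

        # sqrt(2) as the root of x^2 - 2, and the integral of sin on [0, pi] (about 2)
        print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0))
        print(trapezoid(math.sin, 0.0, math.pi))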


    Wrap it up

    Math is the language of technology, and computer science is applied mathematics. The more you know about these foundations, the better software you can make.




    Communication between web pages and server applications

    Web sites were originally intended as a tool for reviewing academic papers, but have evolved into a platform for applications that people use every day. That is a big change, and to enable that change software developers have created many tools and discovered some patterns along the way. Some tools and patterns have worked well, and some not well at all. Here I present some findings from making web applications over the last 20 years. Lots of learning in that amount of time!

    This is part 2 of a set of posts, so this article assumes you can make a web page with HTML and JavaScript, but have a hard time understanding how that page works with a server and an API, and aren't sure about HTTP, SSL, or other terms you have heard along the way.

    What are we making?

    For this example we want something simple, but with some logic, so we can understand how the pieces work. For this we have a fictitious client named Terry. Terry works at the weather office and needs a web app to track weather data, like temperatures. Terry needs a weekly report for this weather data, and is waving around a pile of money to get it done. Let's get this done for Terry. 

    How are we making it? 


    We are going to use Python to accomplish this task. There are many choices when it comes to web servers, and applications are built in various languages (Java, PHP, Ruby, JavaScript), but we are going to use just Python to cut down on the complexity of setting up a lot of tools.

    Servers
    First let's understand how the page in your browser interacts with 'servers'. When you type an address in the browser you are requesting content from that address. What does that mean? It's really the same as opening a file on your own computer, and you can see the difference in the address bar: opening the file locally gives an address starting with file:, and opening the file from a server gives an address starting with http:. What are those indicating? They are the protocols the browser uses to get the content.

    Files are local to your machine when you are using your machine, so the 'file' protocol is used to get them. Files served from web servers are content from another machine, so the browser uses the 'http' protocol to get those. This stands for 'hypertext transfer protocol', and it is exactly that: a protocol to transfer hypertext files around. I suppose 'file' could be renamed "hypertext file protocol", but htfp isn't that descriptive, so file is used.

    What's after the protocol? The name of the domain you are requesting the resource from; this is whatever.com or your favorite site.

    Start a web server on your local machine, and get the file with your browser from that web server instead of your file system. You get to use the http protocol to do this, and you use the domain name of your local machine. This is 'localhost', but as always that name maps to an IP address, and in the localhost case this IP address is 127.0.0.1. All domain names map to IP addresses. This is the internet magic!

    Use apache or nginx or any web server to do this; here is a handy link to get started with the simple HTTP server in Python: https://developer.mozilla.org/en-US/docs/Learn/Common_questions/set_up_a_local_testing_server
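    If you have Python 3 installed, the built-in module described at that link is enough to try this out. Assuming you run it from a directory containing your saved HTML file, something like this serves it at http://localhost:8000:

        # serve the current directory over HTTP on port 8000
        python3 -m http.server 8000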

    Methods
    The HTTP protocol has a number of methods to interact with a web server, to get some content from the server or send some content to it. These are requests and responses, and here we will cover the most commonly used requests: GET and POST.
    GET is what it says: I want to get some content; and POST is: I want to send some content. If you type an address in the browser it's a GET request, but if you fill out a form in a page and press save, you are probably sending a POST request.

    The vast majority of web and application requests are GET requests, and those just fill your browser with HTML text content. Sending data to the server is a different story, so let's dive into the details of POST and how to use it to make web applications. Use your HTML skills to create a form element, and inside that put your input field and submit button. When you POST to localhost, you are sending a request that contains all the info about your browser, and the value of the field you just filled in. How do you handle that with the server?

    Flask
    The application server we are going to use here is Flask. There are many different ones for many different languages, and this isn't a comparison post, but instead an effort to explain the basics and apply some fundamentals to get you on the road to web development nirvana.

    http://flask.pocoo.org/docs/1.0/quickstart/#http-methods
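    As a rough sketch of what that quickstart describes (the route and field names here are my own placeholders for Terry's app, not an official example), a small Flask boundary object that handles both methods could look like this:

        from flask import Flask, jsonify, request

        app = Flask(__name__)
        readings = []  # in-memory stand-in for real storage

        @app.route('/temperatures', methods=['GET', 'POST'])
        def temperatures():
            if request.method == 'POST':
                # the value submitted from the form's input field
                readings.append(request.form.get('temperature'))
                return 'saved', 201
            # a plain GET just shows what we have so far
            return jsonify(readings=readings)

        if __name__ == '__main__':
            app.run(debug=True)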

    Web pattern 1.0


    GET the page and POST your data. The server will get the data from the POST, but what happens when you hit refresh on your browser? The same data will POST again. This can be an issue, since the user just wanted to send the data once, so in your boundary object the good practice is to redirect to a GET request; the data is saved and you don't have the issue of re-submitting it. This gets us further, but two problems now remain.

    1. To show everything on the page you have to get all data to do that, plus any changes to the data from the POST call
    2. The experience for the user isn't so fun, as the browser is constantly getting redirected, causing a 'flicker' effect when dealing with many pages.
    So, wouldn't it be nice to just leave that page in the browser, and POST the data some other way? This was the impetus for the technology pattern we know as AJAX. AJAX lets you POST using JavaScript and get the response from the server through the same function. Browsers after the late nineties supported this feature.

    Back then you had to get creative. I made a Java applet back in 1998 that was very much an AJAX app. It was only about 60kb and it just had the basic networking to get and post XML text to the server. Another applet was the toolbar in a frame, which switched the pages in the main frame. It was a reporting tool for a server-side application. The applet interfaced with the JavaScript in the browser through a JSObject object from Netscape. Getting this Java applet to work in IE on the Mac was quite the trick, but the experience using the page was very slick. About that time the XMLHttpRequest object became available in many browsers, AJAX became a widely accepted pattern, and it spawned a new buzzword: web 2.0.

    Patterns learned

    GET to POST.

    • Use GET to initialize a view of HTML, but don't pass data to the server using GET (unless it's a temporal piece of data like a token)

    POST redirect to GET. 
    • When POSTing data, redirect the user's request to a GET request that confirms what they posted and prevents unnecessary re-submits from the form. A small sketch of this follows below.
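    In Flask terms, that redirect-after-POST pattern might be sketched like this (the /report route and the temperature field are illustrative names, not from the original post):

        from flask import Flask, redirect, request, url_for

        app = Flask(__name__)

        @app.route('/temperatures', methods=['POST'])
        def save_temperature():
            # ... persist request.form.get('temperature') somewhere ...
            # then send the browser to a plain GET, so a refresh can't re-submit
            return redirect(url_for('report'))

        @app.route('/report', methods=['GET'])
        def report():
            return 'saved - here is the weekly report'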

    Web pattern 2.0


    With the advances in web page development as a graphical platform, and the underlying AJAX technology, web pages started to feel like desktop applications. Plug-ins like Java applets and Flash were taking hold, but developers were seeing how the basic html/css/javascript web technologies were maturing into a package we could build applications in.
    Around this time XML was the cool new technology. It was a way to interchange data over the common HTTP protocol, and easy to use because it was just formatted text. The X in AJAX is for XML, and for a few years we were posting XML and getting XML responses and updating our web page elements with those results.
    This was great, but one more improvement was to come. Since we use JavaScript in the browser, and sometimes on the server, the use of XML as a way to interchange data became a bit cumbersome. A JavaScript guru named Douglas Crockford came up with a way to represent everything XML can interchange, but using JavaScript notation for it. JSON was born, and it's now the better way to send and receive data from the server.

    Patterns learned


    POST JSON with AJAX and do so consistently for any form. On the server it's much easier to deal with consistent data structures, and with this pattern the client and server have a common type of data structure to use. Why didn't the term become AJAJ? I guess that may be more correct, but let's not get caught up in the pedantic details and move forward.
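    On the server side, handling that consistent JSON might look like this minimal Flask sketch (the endpoint and field names are my own illustration):

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        @app.route('/api/temperatures', methods=['POST'])
        def add_temperature():
            data = request.get_json()        # parsed JSON body from the AJAX call
            value = data.get('temperature')
            # ... validate and save the value ...
            return jsonify(status='ok', saved=value)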

    Security and Performance
    SSL, sensitive info

    keep the payload light, avoid redirects

    Server patterns are next

    Developing web pages with HTML and CSS

    These are notes I have been using to teach building web applications, https://github.com/jseller/repo

    That is a very large topic that is too much for one post. So, for easier learning, this has been split into four parts:

    1. Web page development 
      1. HTML, the fundamentals of HTML, CSS and JavaScript
    2. Understanding communication between the browser and a web server. 
      1. using HTTP and a server API
    3. Building web applications 
      1. Patterns for building web applications that service requests from the front-end and save data
    4. Data and databases
      1. Structuring data and making scalable applications

    We are going to build a simple web app that explains some fundamentals so you can see how everything works together. First we are going to build a page, then move to the middle tier of logic and the bottom tier of data persistence. Those concepts require a much more thorough knowledge of networks, operating systems and databases. We will get there; but let's start with a simple page.


    Front-end

    The web page is the visual 'front-end' of your web application. This is what the user actually sees and interacts with. It's like looking at a house, a car or a device: for all the technology that is happening under the covers, it's the first impression of just looking at it that can make or break its success. So, it may be a good idea to think of the web page as the decoration and design of the house interior. The walls have to be built, but it makes the room much nicer if the walls are painted and decorated. This highly visual work demands a view from the design and customer side more than from the machine running it. The use of graphics and graphic artistry plays a big part in successful web development, but the success is really how usable it is. The experience the user has in solving their problem or need is what determines success. This is the X in the terms 'UX' or 'CX' that you may have seen before.

    Get Started

    Here we learn how to develop a web page with just enough HTML, CSS and JS to understand the fundamentals. The tools we will use are a text editor and a web browser.

    • First basic HTML, and understanding how a web page is just a document
    • Then add CSS, and learn how the browser visually formats the HTML
    • Then add JavaScript, and learn how we can add interaction to the site and the basic programming behind responding to user actions

    HTML

    The Browser
    You have a browser on your computer, and this is the window you view the world wide web through. It's Chrome, or Firefox, or Safari, or Internet Explorer, or Lynx.
    1. Start it up and go to this site
    2. View source
    3. Save as a web page
    4. Go to the folder where you saved the page
    5. Open the file with your web browser.
    Websites are just html files on another computer. 

    The Editor
    You have an editor on your computer; it's called Notepad or TextEdit. While these can edit and save documents, they don't have a whole lot of extra features. Document editors like Word have many formatting features for documents. For this exercise we will use the Sublime Text editor, but Notepad++ or Atom are useful in the same way: they highlight the HTML, CSS and JS so you can easily see what parts of the page are for the browser and what is the text or pictures that the user can see.
    1. Start your editor
    2. Open the file you saved
      1. Files have types and this lets the computer know what to use. html files are processed by a web browser.
    3. See the mix of html and text, this should pretty much match what you saw in the 'view source' option in your browser
    4. Flip back and forth between your editor and your browser, so you are using both tools to see the same file. 

    Change the web page

    1. Put your name in the paragraph tags
      1. save the file and refresh the browser
    2. Add a heading. There are different headings, h1, h2, and they get smaller as the numbers increase.
    You can see at this point that the 'elements' you are putting around the text don't show up in the browser; they are there to tell the browser how to display the text in between. These should be familiar if you have used any document editor before: paragraphs, headings, bold, italic, etc. This is because the first version of HTML was made to show documents (actually academic papers).

    Add a list of things

    There are ordered lists and un-ordered lists. Add one to your page

    Add an image 

    Images really make a web page come alive.

    Add a link to another page

    This is really the core concept of the 'web' of pages that we have today. You can tell the browser to load any web page on the web by adding a link. When the user clicks the link the page is loaded.

    HTML Guidelines

    Web pages have come a long way. What was originally just a way to format text documents has evolved into the application development user interface it is today. That's a large evolution in a short period of time!

    To keep up, the HTML specification has evolved to add many more standard elements. You could actually use just 2 elements to replicate what the standard document elements do, but you would need to tell the browser the rules for displaying them. This is what style sheets, or Cascading Style Sheets, are for.
    What elements are these building blocks? They are 'div' and 'span'. Think of the div as the structural element that contains a specific part of the page, and the span as a way to change the presentation of the elements within a div.
    • div as structure, span as presentation
    Also, along with the class or style attribute on the elements defining the presentation of the element, there is the 'id' that uniquely identifies the element in the entire page. This will become very important when we add JavaScript to the page later.
    • use classes to style the elements and ids to identify them. This way you can change the look and feel of the page by just changing the CSS and not the HTML

    CSS

    Yes, everything is better with a little bit of style, and your web page isn't any different!

    Style sheets came around when web pages were pushing past the original paradigm of the text document, and the limitations of that paradigm for creating visual experiences. Different fonts, colors and sizes were needed to show the elements of the document in a visual way.

    • add a style section to define some rules for the elements in your page

    Notice we are adding the rules to the page itself, and the page is getting larger, but not too unmanageable. Now imagine a much more complicated website with many rules that could apply to each element; that style section would get too large to easily navigate. To solve this you can (and should) put your style definitions in a separate CSS file and refer to it in your base page.

    Also notice that you can add style as an attribute on any HTML element, and those style rules will apply to just that element. This becomes hard to manage when you want to change style rules that affect the entire document; changing many of these definitions would be hard once the page gets more complex. It's a great idea to define the rules once, and use the class attribute to declare which style rules the element uses.

    • try not to style specific elements; instead, define rules for them separately. It's much easier to manage when things get large.
      • Caveat for html emails. For nicely formatted html emails you have to have all the elements and style definitions in the same file so the email readers have a chance of showing the content consistently.

    CSS

    CSS: use ID for structural selectors and class for presentation selectors
    structural elements by id - how the block elements align themselves in the entire document
    presentation element by class - color, margin, border, alignment within the element

    Name styles by content:
    don't do: .red { color: red; }
    do: .error { color: red; }


    Wrap it up

    Making web pages is fun, and to make complex front-end applications it's really important to understand the basic fundamentals of markup languages and how to format them for effective application presentation.




    Programming Languages: the more the merrier

    What's your favourite language? All of them!

    This post has some random notes on random programming languages. I'm just keeping some links from when I have used the various languages for different products and projects.

    I find programming languages fascinating; you can instruct the computer to... compute things.


    Programming language design is fundamentally mathematical. It follows rules defined in set theory and logic (predicate and propositional).

    Every powerful language has (a small sketch of these three follows the list):
    1. PRIMITIVES (the simplest entities)
    2. MEANS OF COMBINATION (to create complex entities)
    3. MEANS OF ABSTRACTION (treat complex entities as if they were primitives) 
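    A throwaway Python illustration of those three (my own example, nothing more):

        # 1. primitives: numbers and strings
        rate, hours = 42.5, 8

        # 2. means of combination: expressions and calls combine primitives
        pay = rate * hours

        # 3. means of abstraction: name the combination and treat it as a primitive
        def daily_pay(rate, hours):
            return rate * hours

        print(pay, daily_pay(42.5, 8))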

    Languages and Environments

    Does it matter what language is used? The only thing that can be guaranteed is that, one day, there will be a better way to tackle a problem. Any problem. The language is a tool to create a solution for a particular problem, at a particular time. It is the best solution at the time, but there will be a better one around soon, so don't get too attached to it.
    There is no "real man's" language either. Some are more difficult to do well, but there can't be any impression that one is a magical language that does everything well. As a consequence of the environment people find themselves in, they can become fans of a particular language, but they should tread carefully and pay attention, or else they will miss the next change.
     
    It does say something about a programmer when they are willing to learn a new language without having been required to by work or school. Programming languages are tools in a toolbox. The more the better. If you only have a hammer in your toolbox, then the only problems you can solve are the ones involving nails.
     
    If programming languages were cars? www.cs.caltech.edu/~mvanier/hacking/rants/cars.html
    Should programmers be language-independent? blog.reindel.com/2007/08/28/should-programmers-be-language-independent
    lang: http://www.cs.rice.edu/~taha/teaching/05S/411/
    lang: http://www.newartisans.com/2009/03/hello-haskell-goodbye-lisp.html
    A timeline of programming languages: http://www.levenez.com/lang/history.html
    Which are faster? http://shootout.alioth.debian.org/gp4/benchmark.php?test=all&lang=all

    Most programmers start programming by using a Procedural language. These are bits of logic, and when one piece gets too big it is split up into various functions to make the whole thing more readable.
    C, Pascal, Shell scripting, BASIC, web page functions: these are examples of procedural languages.
    A good course to take when learning programming is Comparative Languages (or some similar title). This sort of course requires that a working, useful program be constructed in each of the various styles of programming. A language is picked to demonstrate the iterative, object-oriented, functional and prototypical styles, and another that mixes styles. This is helpful for distinguishing the styles across different syntax and semantics.
     
    Essential to this is the understanding of 'static' and 'dynamic' typing (Early binding and Late binding of values). When comparing the two approaches, a deeper understanding of Compilers and Interpreters is gained in the process.

    Static or Dynamic typing

    It's really a system of typed 'things' that you are dealing with: http://en.wikipedia.org/wiki/Type_system
    http://okmij.org/ftp/Computation/type-arithmetics.html
     
    Types are associated through early or late binding. Languages will have basic built-in types like numbers and characters.
     
    For statically typed languages the programmer is concerned with flow, understanding the machine, and thinking in assembly when programming. The onus is on the programmer to create code that can run fast on a particular processor. This is a skill that's hard to find.

    For dynamically typed languages the programmer allows the computer to profile and optimize the code.
    Then isn't dynamic typing better? Well, it can be, but due to the run-time nature of the type checking, there is a danger of problems only creeping up at run time. These problems would otherwise have been caught at compile time.
    This risk can be alleviated with extensive unit testing.
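    A tiny Python sketch of the trade-off (my own example; mypy is just one of several static checkers you could run over it):

        def total(prices: list[float]) -> float:
            return sum(prices)

        print(total([19.99, 5.00]))   # fine either way
        total(None)                   # a static checker like mypy flags this before running;
                                      # without one, it only blows up at run time (TypeError)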
     
    Also, to make the system more 'dynamic' most statically typed languages allow type casting which occurs at run-time anyway. You'll get the same problem.
    Interfaces, and the separation of implementation concerns from them, are a cornerstone of industrial OO development. Dynamic languages are difficult to use in a large team or shared codebase because of their lack of interfaces.
     
    This may only be a matter of communication and documentation. A poorly written statically-typed interface will be a problem no matter what.
    So, it is safer to use a statically typed language, but all the features that have been added to make these type systems more dynamic spoil this with the resulting runtime type checking. There has to be room for both: static interfaces that are safe to use, and dynamic typing when needed.
    The inevitability of dynamic languages? www.dotnetrocks.com/default.aspx?showNum=277

    Dynamic Typing

    Lisp/Scheme

    to add one and two:
    (+ 1 2)

    Lisp, at first, may seem like more of a collection of syntax than a language. It's really a great language and environment when you get into it. You can also learn some great strategies that carry over to other environments. It has pedantic regularity (with everything being an s-expression)

    It's pretty amazing that a language invented in the late 50s has continued to grow more productive and popular. It is still lacking many of the tools for the style of web applications that are needed today, but that's only a matter of time. Though time may not be on Lisp's side when considering the momentum behind Python and Ruby. Again, it is still around, so we'll see.
    http://www.paulgraham.com
    http://www.norvig.com/
    Scheme is a dialect of Lisp, with the intention of being correct and a good base for teaching.

    An Introduction to Computer Science Using Scheme: http://www.gustavus.edu/+max/concrete-abstractions.html

    StandardML
    OCaml: http://www.ocaml-tutorial.org/ http://caml.inria.fr/
    F# http://research.microsoft.com/fsharp/fsharp.aspx http://blogs.msdn.com/chrsmith/archive/2007/11/10/Project-Euler-in-F_2300_-_2D00_-Problem-5.aspx
    http://en.wikipedia.org/wiki/Haskell_(programming_language) http://cgi.cse.unsw.edu.au/~dons/blog/2006/12/16#programming-haskell-intro
    XSL is a language for transforming and formatting XML documents.

    Static typing and Curly braces

    C, C++, Java, D and others are all Algol derived languages

    D

    Walter Bright has done a huge amount of significant work over the years; writing a compiler is a serious piece of work. http://www.walterbright.com/

    http://boscoh.com/programming/some-reflection-on-programming-in-d-and-why-it-kicks-serious-ass-over-c-leaving-it-died-and-tired-on-the-sidewalk#cpreview
    http://www.dsource.org/

    C++

    C++ is arguably the most mature production-quality programming language in use today. It's my bread and butter.
     
    From a C++ implementation view, understanding how a nice memory-managed smart pointer works will help with issues in tying the various components together. If things get leaky, it gets bad in a hurry. You could have an engine like Ogre, with some other components from the operating system (rendering libs like DirectX and OpenGL), and then ties to other stuff; so using Boost and the STL will help. Once everything seems great, get a good profiling tool.
     
    STL - this is such a standard library that it has to be considered part of the language itself. Use of the algorithm approach on generic containers solves many problems. STL collections, iterators. STLPort - http://www.cprogramming.com/tutorial/stl/stlmap.html The C++ Standard Template Library (STL) is a generic programming paradigm that has been adapted to the C++ programming language, and is an extensible framework for generic and interoperable components.
     
    c++ http://nuwen.net/14882.html 
    BOOST - you will need a good reference counting smart pointer. The shared_ptr and auto_ptr will come to your rescue. Use boost unit tester
     
    http://www.boostcookbook.com/Recipe:/1235053 http://www.boost.org/libs/test/doc/components/utf/index.html http://sourcemaking.com/refactoring/split-temporary-variable C/C++, C# cheat sheets www.scottklarr.com/topic/121/c-cpp-c-cheat-sheets Linking C++ objects using references the-lazy-programmer.com/blog/?p=12
     
    In C++, make everything that can be const a const, and inline small setters/getters.

    Scott Meyers : Effective C++. Just read it and do what he says.

    Modern C++ Design: Generic Programming and Design Patterns Applied by Andrei Alexandrescu

    Herb Sutter
    http://www.cppreference.com/

    www.cprogramming.com/
    www.cuj.com

    C#

    C# was created to bring C++ style to the Microsoft (ECMA standardized) Common Language Runtime. The language was created by Anders Hejlsberg, who made Turbo Pascal; my first programming environment.
    It has great libraries and MS support, and many features that allow it to act like a functional, structured language.
    C# 3.0 tutorial: www.programmersheaven.com/2/CSharp3-1

    C++/CLI

    C++/CLI is the implementation of standard C++ for the Microsoft Common Language Runtime. It's the 'managed' version of C++, so it has support for the memory management that is lacking in traditional 'unmanaged' C++.
    Here is a good intro: http://www.codeproject.com/managedcpp/cppcliintro01.asp

    Java


    Java is a programming language originally developed by Sun Microsystems and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode that can run on any Java virtual machine (JVM) regardless of computer architecture.

    Java was created with many features to try to make high-level programming easier. It's a safer, more mellow C++. While C++ has direct access to the hardware, Java is interpreted in a virtual machine.
    It has really taken its place as the 'enterprise' language, sort of the current standard for most programming. This is reflected in its use as the most common teaching language in North American universities. As long as the programs don't just teach one language this is fine. However long Java lasts in this place, it has brought many people to programming who would have stayed away otherwise.

    When Java came out in 1995 it came with a lot of hype about this safer style of handling memory allocation and clean-up through the garbage collection process. It is an interpreted language, so instead of compiling it for a specific platform, the code is compiled into an intermediate bytecode language, which is then run on a virtual machine. The VM is written for specific platforms. This enables the bytecode to run, unchanged, on many different platforms.

    The extra step of the virtual machine running the bytecode made Java much slower than C++ at first, but since then the speed of the hardware and optimizations to the bytecode and the virtual machine have closed the gap considerably. Java is, 10 years on, an industrial-strength language now.

    Java has a cool feature called Reflection, where you can analyse objects at run time. C# has this same feature.

    I use the NetBeans IDE, which Sun has since bought and now presents as an alternative to the Eclipse project. It's familiar, so it's fine for me.
    The Spring framework uses inversion of control to enable regular objects to be used without a lot of extra code required.

    Scala

    Scala is a functional, descriptive language that runs on the Java Virtual Machine. It has many features around concurrency.
    http://www.scala-lang.org/
    http://langexplr.blogspot.com/2007/07/structural-types-in-scala-260-rc1.html

    Groovy

    Groovy extends the JDK libraries and makes a great tool for all the engineering tasks around Java that Java itself is too clunky for. It has great dynamic features.
    // a program to convert a number into a text representation of its digits
    // 356 should say 'threefivesix'
    def num = '356'
    def words = ['zero','one','two','three','four','five','six','seven','eight','nine']
    num.each() { print words.get(it.toInteger()) }

    Clojure

    Clojure is a Lisp dialect http://clojure.org/rationale

    Pascal

    This language introduced me to the world of programming.

    Mocha/JavaScript/ECMAScript

    Fascinating story about how this language came about.
    Use a good library to abstract yourself from inconsistent implementations in browsers. 

    this
    use of the keyword 'this' can cause issues in anonymous functions. You can assign the value of 'this' to another private member, but you can also use the built-in methods 'call' and 'apply' to preserve the scope of the contents of 'this'.

    http://www.crockford.com/javascript/private.html http://jibbering.com/faq/faq_notes/closures.html Interesting views on the world's most misunderstood programming language and jslint from Douglas Crockford Javascript and JSON Javascriptkit for comprehensive language reference Excellent free 

    JS in the browser is the heart of the browser platform. It allows the script to manipulate any part of the document currently loaded; an easy example is writing out some text and having a function that changes it when the user interacts with the page.

    Written in a short time by one person, it has become the most used language in history. It's much maligned, and has some issues, but it's actually a wonder how well it has worked, for so long.
    To encourage some standardization, the vendor-neutral ECMA spec emerged.


    Continuously integrate software and everything around it


    Continually integrating code increases velocity and quality

    While you are working away on a project or a paper, it's easy to handle all the modifications on your own, since you are the only author. Things get interesting when many authors need to use the same source.

    Integrate early and often.

    Especially when you have the very first thing working, and you have a unit test and functional test doing the bare minimum, take a moment to integrate with the larger project, commit changes to the branch and generally harden the edges around what is newly built. 
    Once the resulting leaner version of the feature is in the build, expanding on that feature is done more effectively, as you have the build cycle moving with everything intact.

    Automate the building of the software. Once the scripts are done (and they are way easier than everyone seems to think they are) the project builds itself every night. Why not do this? Everyone is home sleeping, and the computers at the office are just running idle all night. Use that time.
     
    The build script, like all scripts, starts small and goes from there. The build script should do the basics:
    • Clean the source directory.
    • Sync code into staging area for building
    • Build all source in staging area. Full build too; no incremental, or half build shortcuts. Build dependencies, and any inclusions.
      Build target libs and place into target directories.
      Build binaries and link all necessary libraries.
    • Clean all intermediary files
    • Package binaries and libs for distribution
    • Configure target machine for deployment
    • Check network configuration
    • Deploy binaries and libs
    At each step some reporting can be done and the results archived. The script runs at night when everyone is asleep. The results are ready the next day, and every day.
    Polish your build!
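    A nightly build driver doesn't have to be fancy to start. Here is a rough Python sketch of my own; the commands, repository URL and paths are placeholders for whatever your project actually uses:

        import datetime
        import subprocess
        import sys

        # each step is just a shell command; replace these with your project's real commands
        STEPS = [
            ("clean",   "rm -rf staging && mkdir staging"),
            ("sync",    "git clone --depth 1 https://example.com/your/repo.git staging"),
            ("build",   "make -C staging all"),
            ("test",    "make -C staging test"),
            ("package", "tar czf dist.tar.gz -C staging dist"),
        ]

        with open(f"build-{datetime.date.today()}.log", "w") as log:
            for name, cmd in STEPS:
                result = subprocess.run(cmd, shell=True, stdout=log, stderr=log)
                print(name, "ok" if result.returncode == 0 else "FAILED")
                if result.returncode != 0:
                    sys.exit(1)  # the build is 'broken' if the entire process doesn't complete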
     
    CONTINUOUS INTEGRATION
    INTEGRATION
    • Integration is the assembly of all the parts to end up with a deliverable 
    • building, testing, packaging, deploying.. 
    • One person doesn’t need an integration effort, but more than one person does 
    SOURCE
    • Artifacts are created: source code, and other dependencies 
    • Source from different places has to end up in one place 
    • There are many options, but using a source control tool is the best practice. 
    BUILDING FROM SOURCE
    • A common, neutral build is necessary. 
    • ‘Works on my machine’ works if you are distributing your machine 
    • A common build machine/environment eliminates local dependencies and gives confidence it can work elsewhere 
    • When to build all this? Weekly? Daily? 
    TESTING THE BUILD
    • Once the source compiles, how do you know it works at all? 
    • Manually running the build and exploratory testing can take you so far 
    • Having unit tests can give more confidence that the code that is tested will work 
    PACKAGING FOR DEPLOYMENT
    • You can just zip up the compiled source, email it somewhere and run it… 
    • Error prone, it’s easy to forget menial steps 
    • Time consuming as new dependencies are usually found as a result of things ‘not working’ 
    • Run those tests to give confidence that all tests will pass. Tests that aren't in the dependency tree shouldn't be affected. Make unit, integration and functional tests run separately or all at once
    IT’S A PROCESS
    • Building, testing, packaging, versioning and deploying define a process 
    • If it is a defined process, it can be repeated. 
    • It’s ‘defined’ if you did it once. Never wait for a master plan to start, just evolve the plan as you go. 
    • The build is ‘broken’ if the entire process doesn’t complete. 
    AUTOMATE EARLY AND OFTEN
    • Once the process is automated, run the process as much as you can, not ‘when you need to’.
    • Always look to automate any repeatable process. 
    • Run the process continuously and be aggressive about finding more tasks to automate 
    • Why do all that work when a computer can do it for you? Computers just sit around all night and they don’t get tired. 
    CONTINUOUSLY INTEGRATE EVERYTHING
    • Spend your time actually building the technology, not doing the housekeeping around it.
    • Automate the source and change control, the build and packaging, the testing and verification, and the deployment 
    • Only one trigger is needed: when the source changes. 
    • Run on every change, every hour, all night… all the time. Who cares? It’s automated. 
    PRODUCT
    • Continuous integration tool: ‘Hudson’ 
    • Source Control: git 
    • Building: using compilers specific to the code base 
    • Metrics: running analysis tools on codebase 
    • Testing: running unit tests on every build 
    • Metrics: running more analysis tools on built codebase 
    • Packaging: auto deploying to development environment 
    • Allows builds anytime that have aligned versions

    Wrap it up


    Continuous integration is a great strategy to automate a lot of quality into your application, and over time you will end up saving a lot of time and increasing the general quality of the technology that you are making.

    It's a process; do a little each week and the results will quickly build up over time.




    Communicating software architecture

    Communication is key and communicating abstract concepts is difficult. In a remote working world, communicating effectively is more important than ever.

    This is about working on software in a team environment and communicating the rather abstract nature of software so your teammates can understand what you are making. 

    This is essential to a group making good software together. The audience will have various levels of skill and function; how do you get them what they need?

    Who are you communicating to?

    First we need to consider the audience and provide the correct context. For a developer, the audience usually falls into the groups that surround you day to day:

    • An operations person deploying and monitoring the software
    • A developer needing to configure the software and environment around it to add features
    • A quality engineer testing the features
    • A product manager understanding cost and limitations to figure out what can go on the road-map.
    • A user, using your application to solve a specific problem.

    Context

    When illustrating the solution, the context would be the problems the solution is solving. Good communication takes into account the audience and what they need to get out of it. If I'm testing software, I don't need a lengthy explanation of the computer science behind an algorithm, but if I am deploying and monitoring the software this would come in handy in explaining why some operations take a long time or use a lot of resources.

    Who is the audience for the design and how does an Architect deliver the design to them?
    In a software project or product, it is essential to formulate the design for the audience, and in software there are three groups that consume the design to make the final deliverable.

    Product and Engineering delivering Solutions
    • Use the Quality Model (https://en.wikipedia.org/wiki/ISO/IEC_9126)
    • Understand and define the value of the software in the context of the experience of the audience using it. This defines how the functionality and aesthetics combine to deliver value.

    Engineers validating, deploying and monitoring Applications
    • Use the 12-factor model (https://12factor.net/)
    • Validate the design and operation of the software on the machine for the user so its structurally sound

    Programmers building Containers and Components
    • One current approach for expressing structure that I have got a lot of value from is the 'C4' model (https://c4model.com/). This separates the Context, Containers, Components and Code (classes).
    • Build the software in the context of the users and machine so its functional

    What is Software architecture?

    The architecture of software is explained in many ways, sometimes verbally, but as it grows the complexity of the system is such that some formalized description is needed. This can be done with a whiteboard, or a diagramming tool that makes boxes, lines and whatever the author sees as necessary to describe what the system is.

    In any case, it is really important to write things down and operate from a common source of truth.

    This is the structure of the software, and it is valuable on its own, but it's incomplete until you have been able to describe the behavior of that structure in a given context.

    Structure

    While hand-waving and white-boarding got us far, there was an understanding early on that a standardized language would help. This became the Unified Modelling Language, and it has given a good foundation for formalizing architecture. Like any standard, it is only as good as its acceptance as a standard. So use UML pragmatically; I have seen pictures of hand-drawn whiteboards go quite far!

    Behavior

    The behavior is what the structure does with the inputs it is given. This is really what the users do and what you test for.
    These can be problems from the user, the domain, or constraints from the environment.

    Using a format like BDD really helps formalize the behaviour, so you can build and test with a more accurate understanding of what the user will do:
    • user scenarios (understand the problem from user perspective, functionality and usability)
    • domain requirements (validate solutions with domain/business rules, functionality and domain rules)
    • environment constraints (cross-cutting concerns that all features have, context that feature share)

    Just enough architecture...

    When thinking about architecture, it's natural to think of architecture in other technologies. Houses, cars and reactors need architecture, and for those technologies the architecture is an output that is needed before any building happens.

    Software development doesn't have this constraint; you can just start building and the architecture will happen by consequence. While this is possible, and done quite often in a fast-moving startup, it's a symptom of a problem that will manifest itself eventually. By not defining the architecture along the way, the design of the system grows organically, and like a garden that never gets weeded it will grow in complexity until it's out of control.

    Design often, but only when needed; and keep it consistent.


    Packaging code for development and deployment

    Packaging application code for deployment


    For any programming environment, the application or library you are making will have dependencies. Different libraries and different components are used to fulfill the responsibilities of the application functionality. 
    This article is a bit of a comparison of how different languages and environments package all the dependencies so the software can run on the target machine, or in the environment it's intended to be hosted in.

    When developing the application, it can be an easy win to include the library in your IDE or just on your machine in a global environment setting. This sort of pattern pretty much guarantees that the app is only going to work as expected on your machine, rather than being portable enough to work consistently in QA, staging and production. Save yourself the hassle and set up package management so that you have a good foundation that will work in different environments.

    Goal: portable, consistent builds and deployments

    The way to do this is to make your workflow deterministic. This means that the operation has the same result when run many times and in many different places. For package management in your build, this means ensuring that the versions of the various dependent libraries are the same. You need to lock them down so you can guarantee consistency.

    So, for any language and environment the goals are similar:
    1. make the build deterministic.
    2. the environment setup when developing should be straightforward
    3. aligned versions of the dependent packages.
    4. enable portability and consistency between environments
      1. Your IDE can help, but don't stop there. Make sure it works in a different environment before calling it done. Continuous integration tools are great for this.
    So much of this depends on the environment. Is this an interpreted language that runs on a virtual machine, or one compiled to object code that is specific to the machine? With these concepts in mind, let us use some different programming environments to see it all in action. We will compare Python, Java, Node and C++.

    python 


    With Python, you have dependent libraries that are managed by a program called pip, and in the requirements.txt file you should see the various packages that are required by the application. You also have a global environment, and these can work together, but you have to keep tabs on what is deployed in which environment. This gets dicey quickly.
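    A requirements.txt is just a list of pinned package names (the packages and versions below are only an example):

        flask==2.3.2
        requests==2.31.0

    and the two pip commands you end up using most are:

        pip install -r requirements.txt    # install the pinned set
        pip freeze > requirements.txt      # snapshot what is currently installed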

    pyenv

    Pyenv allows you to set a particular Python version globally on your system, or for a specific local directory. This is handy when working on different projects that are based on different versions.
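    For example (the version numbers are just examples):

        pyenv install 3.11.4    # build and install that interpreter
        pyenv local 3.11.4      # pin this directory to it (writes a .python-version file)
        pyenv global 3.10.12    # the default everywhere else
        python --version        # now resolves through pyenv's shim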

    virtualenv

    To isolate the packages used by just your application, and to be sure you are using only a specific set of packages, a virtual environment is a great tool. This basically makes a sandbox on your machine and only uses the packages contained therein.
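    With the venv module that ships with Python 3, the sandbox is a few commands (the .venv directory name is just a convention):

        python3 -m venv .venv              # create the sandbox
        source .venv/bin/activate          # use it for this shell session
        pip install -r requirements.txt    # packages land inside .venv only
        deactivate                         # back to the system environment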

    environment variables

    So, now that the packages are contained, the runtime behavior of the application may need to use environment variables. This introduces the same problem as the dependent packages: which values are set where?
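    In the application code that usually looks something like this (the variable names are hypothetical):

        import os

        # read configuration from the environment, with a safe default for development
        db_url = os.environ.get("WEATHER_DB_URL", "sqlite:///dev.db")
        debug = os.environ.get("DEBUG", "false").lower() == "true"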

    So, with these moving parts it's pretty easy to see how things can get out of hand. It would be great to have a tool that manages both packages and environment variables, so you have a consistent experience.

    Pipenv 

    It turns out there is a great tool for this: pipenv. Pipenv will organize your packages and variables in one single environment, and also take a snapshot of all this in one file: Pipfile.lock. When this file is present with all the settings contained, you are able to deploy to another environment using pipenv. This is a great piece of software, and comes from the same author as Requests. He knows his stuff! (https://www.kennethreitz.org/projects)
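    A typical loop with pipenv looks roughly like this (the package name is just an example):

        pipenv install flask        # records it in Pipfile and pins it in Pipfile.lock
        pipenv run python app.py    # run inside the managed environment
        pipenv lock                 # refresh the lock file after changes
        pipenv sync                 # on another machine: install exactly what the lock file says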

    Java

    In Java the environment is the Java virtual machine, so settings specific to the hardware aren't really something you need to care about. Packaging dependencies and their versions are, and there are two tools in regular use for package management: maven and gradle.

    Building/compiling

    Since you are compiling to bytecode, the results are class files, and those are zipped up into a jar file (or not) so the virtual machine can run the portable bytecode on the target machine.

    Packaging

    Maven is an XML-based declarative language for dependencies and build management.

    Gradle was created to use a nicer language (Groovy), and it builds a proper dependency graph, while maven is really just a set of jobs that you need to put in the right order. If you use maven and your maven file is bigger than 500 lines, it might be a good idea to try gradle. Gradle is also a nice intro to the Groovy language, and with other tools (jenkins) using Groovy, it's worth the time to learn it.


    C++

    C, D, and C++ compile for the actual target machine, so your dependency management will change depending on the OS you are using. DLL files on Windows and .so files on Linux are used to package libraries.

    Node

    Have you ever grown a garden? If the garden grows, you get some vegetables and some weeds. It happens, but when you pick your vegetables do you take the weeds out, or just make a new garden bed and double down on your fun? If you like option #2 then you will love npm!

    Kidding aside, npm is just a victim of its own success. The growth of javascript on the server-side came on so quickly that without a strict overseer of the project it just became what it is today.

    Building

    Yarn is a more stable binary in my experience (at the time of writing)

    Packaging

    Use package-lock.json to set your versions appropriately. If not, you just get what is on the host machine and things get wild pretty quickly. To get even better results, try the shrinkwrap command:
    https://docs.npmjs.com/cli/shrinkwrap