Ehsan Ghanbari

Experience, DotNet, Solutions

Software development and butterfly effect

If you have ever developed a large-scale system, you have seen how small, mistaken changes in one place, especially in the earlier steps, can damage other parts of the system later on. I'm not going to offer a cure for it; actually, there is no way to stop such changes and their effects entirely. But you can reduce the number of these unwanted issues with some techniques.

These little changes in the first steps, or in a specific part of software development, are an instance of the butterfly effect discussed in chaos theory. According to chaos theory, a small change anywhere in the system (requirements, analysis, design, code, testing) can cause a similar change nearby, which in turn causes another similar change. Maybe you have heard the saying: "Small variations in the initial conditions of a dynamic system may produce large variations in the long-term behavior of the system." So it's worth sitting down to analyze and design completely before you start coding, to reduce the butterfly effect in the future of your system.

This phenomenon of sensitive dependence on initial conditions is known in physics as the butterfly effect. Errors made during early collaboration, requirements, and design are roughly ten times as expensive to fix in the coding stage, and they become even more expensive later on. In general, early analysis of requirements (agile manifesto!), exact estimation of the project's scope, and managing the development team (conventions and collaboration) are all necessary to reduce the risky changes of a project.


Convention over configuration

Convention over configuration is a software design principle, philosophy, and technique in which behavior is implied by the structure of the code instead of requiring explicit configuration: the framework "just figures it out" from naming conventions, which also avoids duplicating information in the system. Developers only need to specify, and worry about, the unconventional aspects of the application and its architecture. During the development process lots of events can occur, so you should follow some conventions and know them well. But be careful: too much convention can make code confusing and make the system difficult to understand.
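The idea can be sketched with a toy, hypothetical persistence layer (not a real ORM): the table name is derived from the class name by convention, and only unconventional cases are configured explicitly.

```python
# Illustrative sketch of convention over configuration.
# Convention: the table name is the lowercase class name plus "s".
# Only classes that break the convention declare anything explicitly.

class Model:
    @classmethod
    def table_name(cls):
        # Use the explicit configuration if present, else fall back to the convention.
        return getattr(cls, "custom_table", cls.__name__.lower() + "s")

class Order(Model):
    pass  # follows the convention -> "orders", zero configuration

class Person(Model):
    custom_table = "people"  # unconventional plural, configured explicitly

print(Order.table_name())   # orders
print(Person.table_name())  # people
```

Note how `Order` needs no configuration at all; only `Person`, the unconventional case, says anything explicit.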

Optimistic and pessimistic concurrency control

Optimistic and pessimistic concurrency control, also known as optimistic and pessimistic locking, are mechanisms for preserving database integrity in multi-user applications. Optimistic and pessimistic are two kinds of locking control. In general, concurrency control ensures that operations produce correct results. Perhaps you are familiar with locking to serialize access to shared data; for example, when several people in a system are working on the same data, what will happen if they save their changes at the same time? Imagine this situation with hundreds or thousands of users! I first heard about optimistic and pessimistic locking in Martin Fowler's book "Patterns of Enterprise Application Architecture", so let's consider them separately.


Optimistic Concurrency: it is based on conflict detection and transaction restart. Optimistic concurrency is, as the name suggests, optimistic when two operations occur simultaneously: the system assumes there will be no conflict and allows transactions to execute without locking any resources. The check happens later, at write time; for example, when two users update the same data, the later update is discarded and that user is informed. If a conflict occurs, the application must re-read the data and attempt the change again.
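A common way to implement this check is a version number on each record, which can be sketched with a hypothetical in-memory record (not tied to any particular database):

```python
# Minimal sketch of optimistic locking with a version number.
# No lock is taken while users work; the conflict check happens at write time.

class StaleDataError(Exception):
    """Raised when the record changed since it was read."""

class Record:
    def __init__(self, value):
        self.value = value
        self.version = 0

    def update(self, new_value, expected_version):
        # Optimistic check: has someone else committed since we read?
        if self.version != expected_version:
            raise StaleDataError("record changed since it was read")
        self.value = new_value
        self.version += 1

record = Record("draft")
seen = record.version              # user A reads the record at version 0
record.update("edit by B", 0)      # user B commits first; version becomes 1
try:
    record.update("edit by A", seen)  # user A's stale write is rejected
except StaleDataError:
    retry_needed = True  # A must re-read the record and attempt the change again
```

User A's transaction is the one that restarts: it re-reads the record (now at version 1) and retries with the fresh version.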


Pessimistic Concurrency: it uses locking as the basic serialization mechanism. It is called "pessimistic" because it assumes the worst case: some users will try to change the same data at the same time. When two or more people make changes to the same data, a lock is placed to prevent the possibility of a conflict. Pessimistic concurrency control locks resources as they are required, for the duration of a transaction; requests are handled one after another, and no other thread or user can access the item while it is locked.
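With in-process threads instead of database sessions, the same idea looks like this minimal sketch: each writer takes an exclusive lock for the duration of its "transaction", so conflicting updates are serialized up front rather than detected afterwards.

```python
# Minimal sketch of pessimistic locking: writers queue up on an exclusive
# lock, so the check-then-update sequence can never interleave.

import threading

class Account:
    def __init__(self, balance):
        self.balance = balance
        self._lock = threading.Lock()  # the "pessimistic" lock

    def withdraw(self, amount):
        with self._lock:  # other threads block here until the lock is released
            if self.balance >= amount:
                self.balance -= amount
                return True
            return False  # insufficient funds; the update is refused

account = Account(100)
threads = [threading.Thread(target=account.withdraw, args=(30,)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(account.balance)  # 10: only three of the five withdrawals can succeed
```

Without the lock, two threads could both pass the balance check before either subtracts, and the account could go negative; with it, the balance never does.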



More information

  1. Patterns of Enterprise Application Architecture by Martin Fowler

What is Subversion?

It's about 12:46 am on 8/5/2013. I don't know anything about Subversion, and there are six open tabs in my browser to discover what Subversion is for. I'll share it in an hour!

What's Subversion?

Subversion is all about managing and tracking changes; overall, it is an advanced version control system. Its tools are very useful for versioning the changes of files, web pages, and any document of the software. The main idea behind Subversion is to keep programmers from overwriting each other when they work on the same file in a team. Subversion was created in 2000, and its tools are usually open source. Subversion is different from continuous integration, because Subversion cares about the directories and files it is supposed to track the changes to.


What's the need for it?

Software development is an evolving process. The software can change several times during development, it can be divided into several versions, and so on. All of this needs to be kept track of. As I mentioned, Subversion was introduced to prevent programmers from working on the same files and overwriting each other's code. Subversion lets developers be confident about version control and not waste any time on it.


You can see the useful resources I used to learn about Subversion below.







A Short Talk about Continuous Integration

CI (continuous integration) is all about merging the developers' work frequently; merging could happen one or several times a day. Before integrating, developers should build, test, and run the code successfully. The main idea behind CI is to prevent integration issues, and the best CI implementations include test-driven development. In a team, everyone should work with the latest version of the system, so every member of the team must commit their latest changes daily. Committing regularly reduces conflicting changes; everyone can see the result of the latest changes, and also who made each change and why. In this case, if there is an error in the system, it can be detected more easily. A longer period of time between integrations makes finding and solving errors and bugs hard and complex.
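The "build, test, then integrate" rule can be sketched as a pre-merge gate. This is a toy illustration with hypothetical stand-in functions, not any real CI server's API; a real build server automates exactly this loop on every commit.

```python
# Hypothetical CI gate: a change is only merged into the mainline
# if the build and the tests pass first.

def integrate(change, build, run_tests, merge):
    if not build(change):
        return "build failed, integration rejected"
    if not run_tests(change):
        return "tests failed, integration rejected"
    merge(change)  # only reached when build and tests both succeed
    return "integrated"

# Toy stand-ins for the real build/test/merge steps:
mainline = []
result = integrate(
    change="feature-x",
    build=lambda c: True,       # pretend the compile succeeded
    run_tests=lambda c: True,   # pretend the test suite passed
    merge=mainline.append,
)
rejected = integrate(
    change="feature-y",
    build=lambda c: True,
    run_tests=lambda c: False,  # a failing test keeps the change out
    merge=mainline.append,
)
```

The broken change never reaches the mainline, which is the whole point: the team always pulls from a version that built and tested cleanly.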

        “Continuous Integration doesn’t get rid of bugs, but it does make them dramatically easier to find and remove.” ~ Martin Fowler

                      Continuous integration



As you can see in the picture above, the developer and the tester receive feedback and the latest version of the software after committing their changes. Integration is a long and unpredictable process and should be done continuously. The biggest advantage of CI is that it reduces risk: imagine that in a large-scale application the developers want to merge their changes after working on them for a week; I think they would spend another week fixing bugs, errors, and incompatible parts. So the traditional ways can be risky solutions.


Refer to these addresses to get the complete picture.


About Me

Ehsan Ghanbari

Hi! My name is Ehsan. I'm a developer, passionate technologist, and fan of clean code. I'm interested in the architecture and design patterns of enterprise and large-scale applications, and I spend a lot of my time on the subject of architecture. Since 2008, I've worked as a developer for companies and organizations, and I've been focusing on the Microsoft ecosystem the whole time.
