I probably speak for many developers when I say that I truly miss the days of single layered desktop applications, in which we could insert a breakpoint wherever we wanted the debugger to stop and debug all the code in one shot with one tool. Although it wasn't that long ago, debugging in the "old days" was fairly straightforward.
Then the web came along and made us work harder. We were faced with browser client code and server code running on different platforms, forcing us to use multiple debuggers to debug a single application flow. When AJAX was thrown into the mix, debugging became even more difficult, because now we had client events, server events and multiple asynchronous calls per application. Debugging an AJAX application became as challenging as debugging a multi-threaded server.
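To make the comparison concrete, here is a minimal sketch of the problem. The request names and delays are invented for illustration, and `setTimeout` stands in for real AJAX latency, which in practice is nondeterministic:

```javascript
// Hypothetical example: two "AJAX" calls issued in one order, completing
// in another. setTimeout simulates server round-trip latency.
const completionOrder = [];

function fakeAjax(name, delayMs, callback) {
  // The callback fires later, on its own, long after the code that
  // issued the request has returned.
  setTimeout(() => {
    completionOrder.push(name);
    callback(name);
  }, delayMs);
}

// The page issues two requests back to back...
fakeAjax("loadUserProfile", 30, (n) => console.log(n + " done"));
fakeAjax("loadNewsFeed", 10, (n) => console.log(n + " done"));

// ...but the second one completes first. A breakpoint placed here tells
// you nothing about either response; each callback must be debugged on
// its own, much like separate threads on a server.
setTimeout(() => console.log(completionOrder.join(", ")), 50);
```

Because the callbacks run independently and in an order the source code does not show, stepping through the program top to bottom no longer follows the application flow.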
With all the advances in technology, does debugging web applications have to be that complex? In this article, we'll examine the history of debugging and the options we had. A following article will examine a new trend in debugging and an alternative option.
Is "old school" debugging gone?
Not long ago, debugging an application wasn't such a big deal. You just had to open the source code project in your IDE of choice, open the file containing the code you wanted to debug, and then run your application in debug mode, stepping through it line by line.
Today we end up with two to five separate debugging tools, plus a network sniffer, just to track our code. Even assuming we master all of those tools and can orchestrate them together, we can probably agree that the old method was much more productive.
We can take a different perspective if we think of the web environment as one that is still evolving. Consider the evolution of desktop application development, which culminated in VB 6.0; before that, we had to code Windows applications with extremely complex paradigms such as MFC, which only a few of us really mastered.
History repeats itself
The evolution of software development teaches us that new programming languages made development simpler only by abstracting over existing paradigms. Before I explain this further, let's agree on the premise that development simplicity includes debugging and testing simplicity. Simple code built out of larger blocks is much easier to debug than low-level, complicated code. Furthermore, each development environment involved in the process brings its own debugging paradigm, and orchestrating several debugging environments is no trivial task.
So, if we look at Assembly language development, except for some minor tweaks over the years, the language itself never really changed. However, the environments evolved and provided better tools to develop and debug (from vendors like Borland and others).
Pascal, C and other languages of the same generation are essentially an abstraction on top of Assembly language. More specifically, the C language left some development complexities unhandled: array boundaries, allocation and disposal issues, free-style pointers and many other too-powerful development weapons that in many cases were aimed back at the developer. Some languages in this generation independently addressed those issues by proposing protections within the language.
Then came the excellent Borland C IDE, which made it easier to code, track and debug C code; until then, we had used plain text files and a standalone compiler. Visual C also provided much easier access to WIN32 methods than before. C++, the object-oriented successor of C, made it possible to encapsulate methods and data, giving the developer a more building-blocks oriented environment on which a reusable infrastructure could be built. At about the same time, technology advanced a bit further and gave us the Windows UI, which began to move developers from command-line applications to Windows application development. The tools for desktop development required highly trained, sophisticated developers and involved a complex debugging methodology, which included non-debuggable WIN32 API methods along with multi-layered project structures (i.e. Win-Projects, MFC and Borland C++).
Visual Basic was quite a revolution in desktop development, putting desktop development within reach of anyone with some programming knowledge. This was due to single layered, single language development and a very organized project and file structure, from which intuitive debugging methods naturally followed.
Visual Basic handled the "mess" of desktop development: it provided an organized event handling mechanism on top of the highly complicated WIN32 messaging environment (in which every entity is a window), saved the developer the hassle of maintaining multi-layered projects with a large number of files, and enabled a single layered debugging experience. Among other things, VB provided the first truly easy to use WYSIWYG designer, which made placing visual controls as simple as drag and drop.
The same kind of complex environment exists today in web development, only now it has worsened, since the runtime environment spans multiple browsers and other devices. In order to develop web applications, you should master coding and debugging of the following:
- .NET or other server-side languages (Java JSP and J2EE)
- Ajax development, including a full understanding of distributed asynchronous behavior
- Multi-threaded server debugging (due to Ajax)
- CSS coding and debugging (across all kinds of browsers)
- XSLT coding and debugging (in those cases where UI is transferred through XSLT, and again across all kinds of browsers)
With all the advances in technology, there has to be some kind of alternative. Where is the trend heading? Is there a way to do single layer web development and debugging?
This was first published in January 2010