Naming Conventions and Coding Standards

18. September 2008 03:08 by lleon in General
ArtinSoft’s top-selling product, the Visual Basic Upgrade Companion (VBUC), is improved daily by the Product Department to satisfy the requirements of the migration projects currently under way. This project-driven research methodology allows our company to deliver custom solutions for our customers’ needs and, more importantly, to feed all of the resulting research back into our products’ capabilities.

Our company’s largest customer engaged our consulting department to customize the VBUC to generate specific naming patterns in the resulting source code. More precisely, the resulting source code had to comply with specific naming standards, plus a mapping customization for a third-party control (FarPoint’s FPSpread). This request pushed ArtinSoft to re-architect the VBUC’s renaming engine, which at the time could rename user declarations only in certain scenarios (.NET reserved keywords, collisions, and a few more).

The re-architecture centralized the renaming rules into a single-layered engine. The rules were extracted from the Companion’s parser and mapping files and relocated into a renaming declaration library. The most important change is that the renaming engine now evaluates every declaration instead of only the conflicting ones: it generates a new name for each conflicting declaration and returns the declaration unchanged otherwise. In effect, the renaming engine “filters” all the declarations and fixes possible naming issues.

But the story does not end there; thanks to our company’s proprietary language technology (Kablok), the renaming engine is completely extensible. Jafet Welsh, from the product development department and a member of the team that implemented the new renaming engine and the extensibility library, explained some details about this technology: “…The extensibility library seamlessly integrates new rules (written in Kablok) into the renaming engine… we described a series of rules for classes, variables, properties and other user declarations to satisfy our customer's code standards using the renaming engine extensibility library… and we plan to add support for a rules-describing mechanism to allow the users to write renaming rules on their own…”

ArtinSoft shipped the new renaming engine in VBUC version 2.1, and the extensibility library will be completed for version 2.2.
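To illustrate the idea, here is a minimal sketch of how a rule-driven renaming engine of this kind can be organized. It is written in Python purely for illustration; the rule names and the API are hypothetical and do not reflect ArtinSoft's actual Kablok rules:

    # Hypothetical sketch of a rule-driven renaming engine, for
    # illustration only. Each rule inspects a declaration and either
    # proposes a new name or returns it untouched; the engine runs
    # every declaration through every rule, mirroring the "filter"
    # behavior described above.

    RESERVED_KEYWORDS = {"class", "event", "property", "namespace"}  # illustrative subset

    def keyword_rule(name, kind, taken):
        # Rename declarations that collide with .NET reserved keywords.
        return name + "_Renamed" if name.lower() in RESERVED_KEYWORDS else name

    def pascal_case_rule(name, kind, taken):
        # Example of a customer-specific standard: class names use PascalCase.
        return name[0].upper() + name[1:] if kind == "class" else name

    def collision_rule(name, kind, taken):
        # Append a numeric suffix until the name is unique in its scope.
        candidate, n = name, 1
        while candidate in taken:
            candidate = f"{name}{n}"
            n += 1
        return candidate

    def rename_all(declarations):
        # declarations: list of (name, kind) tuples; returns the final names.
        rules = [keyword_rule, pascal_case_rule, collision_rule]
        taken, result = set(), []
        for name, kind in declarations:
            for rule in rules:
                name = rule(name, kind, taken)
            taken.add(name)
            result.append((name, kind))
        return result

    print(rename_all([("event", "variable"), ("customer", "class"), ("Customer", "class")]))
    # [('event_Renamed', 'variable'), ('Customer', 'class'), ('Customer1', 'class')]

The point of such a design is that every declaration flows through every rule, so adding a customer-specific naming standard means adding one rule rather than touching the parser.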

Understanding Software Migration, Part 2

3. June 2008 13:35 by lleon in General

As mentioned previously, migration is now an ally of any company attempting to revamp its software systems. To compare all the options the market offers, it is imperative to establish rules for measuring the throughput of the process. But how can such rules be defined when every vendor employs proprietary technology that differs from one to another?

After witnessing the whole process first-hand, the impressions left on the user will decide how a given migration tool is judged and how well it performs. To make sure that judgment is fair, here are some concepts, ideas and guidelines about how the migration process should be done and, most importantly, how it should be measured.

 

· Time:

Human effort is precious; computer effort is cheap, disposable and reusable. An automated process can be repeated as many times as necessary, as long as its design allows the algorithms to accept all possible input values. A migration can be done with straightforward one-to-one transformation rules, resulting in poorly mapped items that each need a small adjustment; but regardless of their size, those adjustments must be made by humans, so a handful of careless rules can turn into hundreds of human hours spent fixing small issues. Remember, we are dealing with large enterprise software products, which means that a single seemingly negligible imperfection can be replicated a million times. The other scenario is complex rules that search for patterns and structures in the source and generate equivalent patterns on the other side; like many AI tasks, this can demand a great deal of computer effort, because of the immense set of calculations needed to analyze the original constructions and synthesize new ones. When weighing performance, then, the user must decide which resource is most valuable: the time people spend fixing the tool's output, or the computer time a more complex migration tool spends generating more human-like code.
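A rough back-of-the-envelope comparison makes the trade-off concrete. The figures below are purely illustrative assumptions, not measurements from any real project:

    # Illustrative cost comparison between a fast, naive translator and
    # a slower, pattern-aware one. Every number here is a hypothetical
    # assumption, not a measurement from a real project.

    total_lines = 500_000        # size of the legacy application
    fix_rate = 20                # lines a developer can review and fix per hour

    naive_issue_ratio = 0.02     # 2% of lines need manual fixes after a naive pass
    smart_issue_ratio = 0.005    # 0.5% after a pattern-aware pass

    naive_human_hours = total_lines * naive_issue_ratio / fix_rate   # 500 hours
    smart_human_hours = total_lines * smart_issue_ratio / fix_rate   # 125 hours

    # Even if the pattern-aware tool needs days of extra computer time,
    # it saves hundreds of human hours in this scenario.
    print(naive_human_hours - smart_human_hours)  # 375.0 human hours saved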

 

· Translation equivalence:

Legacy applications were built using the code standards and conventions of their time; the patterns and strategies used in the past have evolved, some for good, others to become obsolete. An automated software migration process must therefore be able to adapt archaic techniques to newer ones; a simple one-to-one translation reproduces the input pattern verbatim, and the resulting source code will not take advantage of the new features of the target platform. A good migration tool should detect legacy patterns, analyze their usage and look for a pattern on the target platform that behaves the same way. Because of the time considerations explained previously, a faster tool usually means shallow, non-detailed transformations that produce a poor replica of the original code or, in the best case, a code wrapper that papers over the damage done. Functional equivalence is the key to a successful migration, because software migration is not only about getting the software running on the target platform; it is about adapting to a new set of capabilities and actually using them.
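To make this concrete, consider how a pattern-aware rule differs from a literal one on a classic VB6 idiom. The following sketch is a deliberate over-simplification in Python; production tools such as the VBUC operate on full parse trees, not on strings:

    # Hypothetical contrast between a literal, one-to-one substitution and
    # a pattern-aware transformation, using VB6-style error handling. Real
    # tools work on parse trees; strings are used here only for brevity.

    def naive_translate(line):
        # One-to-one mapping: no direct equivalent exists, so the legacy
        # construct is merely flagged and the burden shifts to a human.
        return line.replace("On Error GoTo", "' TODO: no direct equivalent for: On Error GoTo")

    def pattern_translate(lines):
        # Pattern-aware mapping: recognizes the whole On-Error idiom and
        # rewrites it as structured exception handling on the target side.
        label = lines[0].split()[-1]              # "On Error GoTo ErrHandler"
        split = lines.index(label + ":")          # start of the handler block
        body, handler = lines[1:split], lines[split + 1:]
        return (["try {"] + ["    " + s for s in body]
                + ["} catch (Exception ex) {"] + ["    " + s for s in handler] + ["}"])

    legacy = ["On Error GoTo ErrHandler",
              "result = 100 / divisor",
              "ErrHandler:",
              'MsgBox "Division failed"']
    print("\n".join(pattern_translate(legacy)))

The literal translation leaves goto-style error handling behind (or merely flags it), while the pattern-aware rule produces the structured construct the target platform actually favors.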

 

With that in mind, a comparison between different tools should now be clearer. Leaving aside the competitiveness of the market, readers should separate the facts from misleading marketing slogans and appraise the resources to be spent during a migration process. Saving a couple of days of computer time may cost hundreds of human hours, which in the end will not cure a faulty core; it will just make it run.

Understanding Software Migration, Part 1

28. May 2008 10:00 by lleon in General

Enterprise software keeps growing in size and scalability. Small companies depend on custom-tailored software to manage their business rules, and large enterprises with onsite engineers deal on a daily basis with the challenge of keeping their systems up to date and running on cutting-edge technology.

In both cases the investment made in the software systems that support a business is considerable, and regardless of whether a system was purchased from another company or built and maintained in-house, keeping current systems and platforms up to date never stops being critical.

Any enterprise software owner, designer or programmer must be aware of market trends in operating systems, web technologies, hardware specs, and software patterns and brands; given the fast-moving nature of the IT industry, becoming obsolete takes the blink of an eye.

Recall the VB6-to-VB.NET era: a transition with a lot of new technology, specifications and capabilities that promised to take programmers’ applications where it had previously seemed impossible to go. Web services, remoting facilities, numerous data providers accessible through a common interface, and more wonders arrived with the .NET Framework; however, all these features can be very difficult or nearly impossible to incorporate into legacy applications. At that moment it became imperative to translate that software to the new architecture.

Initially, the idea was to redesign the entire system using the new features in a natural way, but this means consuming large amounts of resources and human effort to recreate every single module, class, form, and so on. The process results in a completely new application, running on new technology, that must be tested in the final environment, and that will impact production performance because it has to prove itself against real business workloads. In the end, we get a new application attempting to copy the behavior of the old programs, and a huge amount of resources spent.

Since this practice exhausts both the technical resources and the production metrics, computer scientists researched functionally equivalent automated processes, which were used to create software capable of porting an application from a given source platform to a different, and possibly upgraded, one. During this translation the main objective is to use as many native constructions as possible in the newly generated code, taking advantage of the target technology and avoiding the usage of legacy components. If the goal is to adopt a new feature found on the target platform, the application can be migrated first and the feature then added far more naturally than by building communication subprograms to bridge the new capability to the old technology.

This process is widely promising because it permits the creation of a new system based on the previous one with minimal human effort, by establishing transformation rules that take the source constructions and generate equivalent constructions in the desired technology. Nevertheless, some human input will still be required, especially for very abstract constructions and user-defined items.
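As a minimal illustration of what such transformation rules might look like, here is a hypothetical Python sketch mapping a few well-known VB6 constructions to their .NET counterparts; an industrial tool encodes rules like these against parse-tree nodes rather than a lookup table:

    # Hypothetical transformation rules mapping legacy VB6 constructions
    # to .NET equivalents. The table is an illustrative toy; real rules
    # match syntax-tree patterns and carry context with them.

    RULES = {
        "MsgBox": "MessageBox.Show",  # UI call mapped to a framework method
        "Integer": "Short",           # VB6 Integer is 16-bit; .NET Short matches it
        "Long": "Integer",            # VB6 Long is 32-bit; .NET Integer matches it
    }

    def apply_rules(construct):
        # Return the target construction, or flag the item for manual
        # review -- the human-input cases mentioned above.
        return RULES.get(construct, f"' TODO: manual migration needed: {construct}")

    print(apply_rules("MsgBox"))   # MessageBox.Show
    print(apply_rules("Variant"))  # flagged for a human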

All the comparisons made so far between redesign and migration point to the second practice as the more cost-effective and faster of the two, but now another metric becomes crucial. The automated stage is done by computers using proprietary technology that depends on the vendor of the migration software; but how extensive will the manual changes be? And how hard will it be to translate the constructions the tool did not migrate?

 

The quality metrics of the final product must also be reconsidered, because a properly designed application will be translated with the same design considerations: a given application will be migrated keeping the main aspects of its design, and the only changes in the resulting source code will be minor improvements in some language constructions and patterns. The new quality metrics therefore become: maximize the automation ratio, minimize the amount of manual work needed, generate more maintainable code, and reach the testing stage faster.
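One of those metrics, the automation ratio, is straightforward to state. The sketch below is an assumed formulation, since each vendor defines and measures the ratio in its own way:

    # Assumed definition of the automation ratio: the fraction of the
    # application that the tool migrated without requiring manual work.

    def automation_ratio(total_loc, manually_fixed_loc):
        # 1.0 means fully automatic; lower values mean more human effort.
        return (total_loc - manually_fixed_loc) / total_loc

    print(automation_ratio(500_000, 12_500))  # 0.975, i.e. 97.5% automated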