Here's a neat utility for constant bloggers who want to drive more traffic to their sites.
Available for both Windows and Mac OS, BlogSigs can be configured to work with Outlook, Gmail, Yahoo! Mail, and Hotmail, and it adds a link to your latest blog post to your email signature. It's definitely much better than just putting the standard link to your site's main URL!
This has been a very exciting week for virtualization, at least on the Microsoft side of things. Microsoft made two significant announcements between Monday and today:
- First off, Virtual Server 2005 R2 SP1 was finally released! You can download it from Virtual Server’s website. Make sure you check out the details about the release as well. BTW, the website also got a new “modern” look. Nice job!
- Also, today Microsoft released the whitepaper “Licensing Microsoft Server Products with Microsoft Virtual Server and Other Virtual Machine Technologies”. This whitepaper details the licensing requirements for running Microsoft’s products on Virtual Server, VMware, and other virtualization solutions. It also covers some restrictions for scenarios like moving virtual machines from one server to another, virtual machine libraries, products licensed by CPU/virtual CPU, and others.
Regarding licensing, remember that you can also use the Windows Server Virtualization Calculators to figure out the licensing cost of running Microsoft’s server products in a virtual environment.
By Migration 2.0 (coined in the context of McAfee’s Enterprise 2.0 vision) we want to consider additional ways of addressing Automated Software Modernization in a Web 2.0 context, and whether or how this deserves particular attention. As a complement to a previous post, we have developed a presentation (albeit in Spanish) which may help to clarify (or not) some of the ideas around the notion. Especially relevant are a user-centric notion of migration, and modernization as a sort of “SLATES-ification” of legacy code, where legacy needs to be understood inside the 2.0 context. We will revisit and refine this idea in a forthcoming post.
You will find the slides of the presentation here.
Localizing a VB6 application can be cumbersome, especially if it was never planned for localization in the first place.
Nowadays it is common for your clients to demand support for different languages.
While localization is still a difficult task, we have found excellent results performing it during a VB migration.
Our tools allow us to easily externalize all your strings. After that the translation task becomes easy, and you can even use the help
of nice projects like
Automatically Translate your .NET resource files with Google Translate
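To make the idea concrete, here is a minimal sketch of what externalized strings buy you. It is written in Java purely for illustration (the Messages bundle name and the greeting key are made up); the .NET resource-file approach used after a VB migration follows the same pattern: the code only references keys, and translators work on the resource files without touching the code.

import java.util.Locale;
import java.util.ResourceBundle;

public class GreetingDemo
{
    public static void main(String[] args)
    {
        // Loads Messages.properties, Messages_es.properties, etc. from the classpath,
        // picking the best match for the requested locale.
        ResourceBundle bundle = ResourceBundle.getBundle("Messages", new Locale("es"));

        // The code never contains the translated text itself, only the key.
        System.out.println(bundle.getString("greeting"));
    }
}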
The ArtinSoft migration guide to VB Upgrades and Conversions is becoming a huge success. It is clear that people are consulting it. To me, this is just more evidence that the movement of VB6 applications to .NET is happening!
Link to Upgrading VB6 to .NET – migration guide FAQ
Your comments are more than welcome!
Let us elaborate some loose ideas on a fresher notion of “software modernization”. In what follows, we review the notion of software migration as a modernization path and look at it through the lens of other contextual elements and events in the IT world that have evolved over the last two to three years. Such elements give us reason to identify forces and opportunities for innovation and possible new developments. Up to now, we have focused mostly on an implementation-oriented modernization; namely, one that is mainly concerned with the programming languages, frameworks, platforms, architectures and the like that have prevailed in recent years.
From a technical point of view, we are aware that this implementation-based notion is completely valid and will remain so for a while, because legacy systems that are still useful are forced to evolve at the implementation level while retaining as much as possible of their original functionality and corresponding business value, at a reasonable cost. However, we are also aware that the environment in which information systems serve and survive is evolving so strongly that implementation details probably remain that important only within a traditional IT/IS level and vision. What kind of environment and forces are putting pressure on that vision? Is there an opportunity there for migration?
One important phenomenon is definitely Web 2.0 and, in analogy to how the Internet forced its way into intranets, Andrew McAfee has recently coined the Enterprise 2.0 concept. It embodies the well-known effects of Web 2.0 as a ubiquitous trend and as a social movement, how those effects are now pushing their way into the inside of the IT enterprise, and, as a direct consequence, the kind of tools employees may be willing to use and may need in their regular work environment; an environment where new information requirements are born faster than they can be incorporated as new features into the traditional IS platform. Whether so-called Web 2.0 tools will improve employee productivity is probably an open and debatable question, no doubt about it. We still remember how, not too long ago, e-mail and Internet access at work were considered a source of disturbance per se.
It is also true, however, that being able to search, analyze, discover, tag, publish, share and trace knowledge at the rhythm of the business and of one's own personal information needs has become more important than ever. Traditionally designed ISs could be becoming a contributing factor to a sort of impedance mismatch between the huge flexibility and freedom people currently encounter on the Web (even in their private, personal milieu) and a rigid, traditional IS platform at the workplace. And, we emphasize, this concern arises independently of whether or not such an IS platform is “modern” at the implementation level, which is a different dimension of the matter altogether.
As Dion Hinchcliffe puts it, Enterprise 2.0 is a cultural catalyst (as Web 2.0 is proving to be); we believe this, and interpret it as a realistic vision in which, sooner rather than later, ISs will be judged in terms of McAfee’s SLATES criteria. That will entail a rather stronger force leading to modernization at a higher level than the technical one, because such criteria are tied to social, common-sense, better-understood forces not so directly related to technical issues.
If such a vision is accepted as a legitimate opportunity, we might then be looking for spaces for innovation as we consider moving to the next levels of automation-supported migration. In that case we have to consider migration as a path that enables ISs (not just legacy ones) to evolve in the direction of Enterprise 2.0-like modernization, as a well-defined strategy.
Here’s another time-saving tip. If you select Windows Authentication when installing SQL Server Express 2005 and you later need to change it so that it also allows SQL Server Authentication, all you need to do is change a registry key. To do this, shut down all SQL Server-related services, run regedit, and in the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer, change the value of LoginMode to 2.
Restart all services, and that should do the trick. You can get more information in this KB article (it talks about MSDE/SQL Server 2000, but it also applies to SQL Server Express).
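If you prefer to script the change, something along these lines should also work from an elevated command prompt (just a sketch: MSSQL.1 is the registry key for a default first instance, so adjust it if your installation uses a different instance ID):

reg add "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer" /v LoginMode /t REG_DWORD /d 2 /f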
Here’s a tip that should save you some time (I spent several hours trying to figure this one out). When you deploy a .WIM image with Windows Server 2003 using WDS, you can have that machine take on the name you gave to it in Active Directory (when prestaging the computer). To achieve this, do the following:
- Create a copy of the sysprep.inf file in the folder c:\RemoteInstall\Images\\\$OEM$\$1\Sysprep
- Edit this sysprep.inf file and make sure you have the following lines in place:
[UserData]
ComputerName="%MACHINENAME%"
…
[Identification]
DoOldStyleDomainJoin=Yes
- Remove the file c:\Sysprep\sysprep.inf from the image (if necessary, mount it with imagex)
- The next time you re-image the machine, the WDS client will place this sysprep.inf file inside the image, and will grab the name you gave to the machine in Active Directory when you prestaged it.
Every day I have a lot of stuff to write about on my blog, but as you can see, it doesn’t always happen. Now it is time to get back on track!
As a member of the team behind the open source project BlogEngine.NET, and with its first release just out the door, it's time to tell my story.
Some time ago I wrote about the .NET Slave blog and its good stuff. After that, Mads contacted me about my post just to share a couple of kind words.
Then I found out that Mads wanted to create a new blog engine with some really cool features in mind, especially:
- Written entirely in C# and ASP.NET 2.0
- Small in size and source files
- Plug 'n play implementation (just copy to web server)
- No third-party assemblies
- Using ASP.NET themes and skins
- Easy to extend using plug-ins
- Many more…
Developing a new blog engine these days is a risky thing to consider. When there are already plenty of well-built and well-tested solutions, backed by good development teams, starting from scratch is something you should think over again and again.
How did I get involved, and why?
I got hooked immediately after reading the simple specs I mentioned above. Then I checked out the first bits of BlogEngine and liked the approach so much that I immediately started wondering about getting involved. I contacted Mads and joined in. I also want to mention that when Mads told me about moving the project to CodePlex and using Team Explorer, believe me, I was super skeptical. To my surprise, I can now say that using both CodePlex and Team Explorer has been a really great experience.
Why use BlogEngine.NET?
Because, after trying many of the other blog engines out there (developed with ASP.NET, of course), you can easily find some problems. Or maybe not exactly problems, but different implementation methods for different situations, so complicated that you end up disliking the engines for their lack of SIMPLICITY. I tried engines like .Text, Subtext, DasBlog, Community Server and others. Those platforms are really good, but perhaps only in a context where you don’t have to extend them, because when you think about extensibility or customization you can get into never-ending trouble: you want to do something simple and end up learning a framework/platform where, in some cases, you have to debug through hundreds of classes just to get used to the code and understand how things work. For people like me who like full control, even over software they didn’t write, that is just not an option. Whether I’m on the team or not, I will use BlogEngine.NET because it has all the ingredients an ASP.NET web developer loves in a blog engine: easy to set up, easy to extend and easy to understand.
Now let's celebrate the first release
Today I want to join Mads in making the first official release of BlogEngine.NET. Go visit the BlogEngine.NET website, have a look at the project and, why not, set up your next blog with it. Congratulations to the whole BlogEngine.NET team on the first release. Cheers!
We speculate that software and data migration is frequently driven by IT infrastructure modernization needs (emergencies would perhaps be the more precise word) at the enterprise level, which are often closely related to platform obsolescence and the associated maintenance costs. A possibly even more appreciable benefit might be the need for integration between heterogeneous information systems and databases: the need to easily connect information sources and to create access ports that facilitate extracting and deriving knowledge from those sources, helping at the management and decision-support level in a more natural and easy way. In other words, the real value of modernization would be to support BI-like strata, pointing, for sure, to a better CRM platform among other things. We feel migration is probably perceived not as a goal in itself but as a means, so the added value might lie outside, possibly hidden, with respect to the whole migration effort per se. It seems to me that the vertical road which leads from software migration and modernization all the way to the BI end user is perceived as too long a road, one that is hard to appreciate at first sight at the management level. My question to myself would be whether that has to be so. Shouldn’t migration tools, and our message, position themselves more directly as enablers of the BI layer? And if the answer is yes, we should be asking ourselves how such migration tools can be adapted to close that gap.
I was taking a look at some sources showing BI trends for 2007, especially as they relate to legacy code. I found the following quotes from a Knightsbridge Solutions LLC report (Trends in Business Intelligence for 2007) interesting:
Trend #5
Service-Oriented Architecture: Information Management is Critical to Success (of BI)
The buzz around service-oriented architecture (SOA) continues, with some organizations viewing SOA
as the solution to a wide range of business and technology problems, from improving enterprise agility
to deriving more value from legacy systems (I think we are witnesses of that).
In the BI arena, SOA has great potential for delivering
enhanced capabilities to users. An SOA-enabled BI infrastructure could provide seamless access to
both batch and real-time data integrated across operational and analytical sources. SOA also presents
opportunities for innovation in areas such as real-time data collection and real-time analytic services.
However, companies that approach SOA without a strong information management methodology will
have difficulty achieving the benefits they seek. When implementing SOA on a large scale, companies will
face the same barriers they do in large BI integration projects. For example, some early adopters of SOA
found that semantic incompatibilities in the data between disparate systems are a significant impediment
to SOA adoption. These organizations are discovering that master data management and data governance
efforts must precede SOA adoption to provide a “common language” that supports integration. SOA has
the potential to deliver benefits in BI and many other areas, but not without a solid information management
foundation.
And also:
Trend #8
Influence of Large Vendors: Market Consolidation Expected in 2007
Speculation abounded in 2006 regarding potential acquisitions of pure-play vendors in the BI space.
Business Objects, Cognos, Hyperion, and Informatica were seen as potential acquisition targets with
likely acquirers being enterprise software and infrastructure vendors like IBM, Microsoft, Oracle,
and SAP. Large vendors have moved aggressively into the BI space, building their capabilities through
both acquisitions and internal development (as evidenced by Hewlett-Packard’s recent acquisition
of Knightsbridge).
Strongly consistent with this:
http://www.intelligententerprise.com/channels/bi/showArticle.jhtml;jsessionid=ADNU0C0TP01BYQSNDLRSKH0CJUNN2JVN?articleID=199501674
This is also interesting to look at in more detail:
http://www.spagoworld.org/ecm/faces/public/guest/home/solutions/spago
"Naja" let's see
A quick note – I just noticed I recently went over the
100-post mark here on my blog... woohoo!! I never thought I would get this far with it. I have to admit that it took me a little over a year to get there, but I finally did it. Hooray!!
Right now my colleague
Stephen is delivering the last hands-on lab of the
Virtualization for Developers Lab Series. This means that one of the most interesting training series I have delivered is now over. It has been a good run and, even though we sometimes didn’t get the attendance we wanted, a great experience. Some of the highlights of the series include:
- Meeting all sorts of interesting people with interesting (and crazy) projects at every location
- The experience of getting the setup process for the labs almost fully automated (I learned a lot about Windows in the process)
- Seeing Windows Server Virtualization live for the first time in a presentation by Arno Mihm at a Redmond event
- Going to a tapas bar (“de tapeo”) in Huesca with some of the attendees at the Zaragoza event
For the next few months we’re going to be working on some new trainings and on some other exciting projects. I’ll keep you all posted. In the meantime, remember the HP Integrity labs – that’s where I’ll probably head next!
You can specify the boot program (boot ROM) to use with a specific machine when using WDS. This allows you to set machines to always try a PXE boot first, and to control their behavior from WDS. WDS comes with three x86 boot ROMs, each with different functionality (this also applies to x64 and Itanium – since I only have x86 machines available for testing, I’ll use those ROMs):
- Boot\x86\pxeboot.com: Normal boot ROM. It presents the prompt for F12 and boots from WDS ONLY if F12 is pressed
- Boot\x86\pxeboot.n12: Boots directly from the network (PXE) without waiting for F12 to be pressed
- Boot\x86\abortpxe.com: Aborts the PXE boot process and continues booting from the next boot device
In order to change the boot ROM, you need to have the machine pre-staged in Active Directory, and use the wdsutil command-line program. You also need to know either the name or the MAC address of the machine. With that information, you can issue the command:
wdsutil.exe /set-device /Device:SERVER01 /BootProgram:Boot\x86\pxeboot.n12
OR
wdsutil.exe /set-device /ID:AA-BB-CC-DD-EE-FF /BootProgram:Boot\x86\pxeboot.n12
Either of the previous lines changes the boot ROM of the machine SERVER01 to pxeboot.n12.
By using this, when you have to restore an image to a machine, you can just change the boot ROM to pxeboot.n12, and the process will take place automatically (depending on how you have it configured). Once the image is restored, you can change it back to abortpxe.com, and the machine will continue booting from the hard drive every time.
Sometimes negative experiences turn out to be positive in the long run, something we should all hope to believe whenever things do not happen the way we expected. Though it is already old, and maybe questionable and controversial, I must admit this speech made me reflect a lot; I found it very rich in philosophical material, and it changed me somehow. If you haven’t watched it yet, you might find it interesting:
Video: http://www.youtube.com/watch?v=D1R-jKKp3NA
Text: http://news-service.stanford.edu/news/2005/june15/jobs-061505.html
Next month we’ll be at the last planned HP Integrity Developer Workshop, in East Rutherford, NJ, from June 12 to 14. We’ll be in charge of the Windows 64-bit track.
This is the last workshop planned for this year, so make sure you take this opportunity to get hands-on training with HP’s Integrity servers using dual-core Itanium CPUs. You can choose to receive training in Windows, Linux, HP-UX or OpenVMS, and get help with your ports to the Itanium platform. Plus, you get to take home the server you worked on! Check out the benefits of the class (from the Workshop Overview):
- Your dual-core Itanium-based application porting efforts well underway or, in many cases, completed
- The HP Integrity rx2620 server that you used in the classroom sent directly to you from the workshop for your continued porting and testing efforts
- Free software development tools
- Membership in HP's Developer & Solution Partner Program, which allows you to take advantage of GTM and lead generation programs once your port is completed
Make sure you reserve your spot for the workshop!
Here is a list of situations you might want to handle when authenticating against Active Directory:
525 - user not found
52e - invalid credentials
530 - not permitted to logon at this time
532 - password expired
533 - account disabled
701 - account expired
773 - user must reset password
This is an extract from the Java forums showing how to handle these cases. Good luck!
} catch (AuthenticationException e) {
    // The Active Directory error message contains "AcceptSecurityContext error, data XXX";
    // the sub-error code (773, 52e, 533, ...) tells you why the bind failed.
    String tempString;
    StringTokenizer tokenizerTemp = new StringTokenizer(e.toString());
    while (tokenizerTemp.hasMoreElements()) {
        tempString = tokenizerTemp.nextToken();
        if (tempString.equalsIgnoreCase("AcceptSecurityContext")) {
            while (tokenizerTemp.hasMoreElements()) {
                tempString = tokenizerTemp.nextToken();
                // The setIsXxx(...) calls are the application's own setters
                if (tempString.startsWith("773"))
                    setIsPasswordExpired(true);
                if (tempString.startsWith("52e"))
                    setIsPasswordWrong(true);
                if (tempString.startsWith("533"))
                    setIsAccountDisabled(true);
            }
        }
    }
    throw new NamingException();
}
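As a side note, tokenizing the message works but is a bit fragile. Below is a small, self-contained sketch (the class name, the regular expression and the sample message are my own, not from the forum post) that pulls the "data XXX" fragment out of the exception text with a regular expression and maps it to the descriptions listed above:

import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AdErrorDecoder
{
    // Matches the sub-error code in messages like
    // "... AcceptSecurityContext error, data 52e, v893"
    private static final Pattern DATA_CODE = Pattern.compile("data\\s+([0-9a-fA-F]+)");

    private static final Map<String, String> DESCRIPTIONS = new HashMap<String, String>();
    static
    {
        DESCRIPTIONS.put("525", "user not found");
        DESCRIPTIONS.put("52e", "invalid credentials");
        DESCRIPTIONS.put("530", "not permitted to logon at this time");
        DESCRIPTIONS.put("532", "password expired");
        DESCRIPTIONS.put("533", "account disabled");
        DESCRIPTIONS.put("701", "account expired");
        DESCRIPTIONS.put("773", "user must reset password");
    }

    public static String describe(String exceptionMessage)
    {
        Matcher m = DATA_CODE.matcher(exceptionMessage);
        if (!m.find())
        {
            return "unknown error";
        }
        String code = m.group(1).toLowerCase();
        String description = DESCRIPTIONS.get(code);
        return description != null ? description : "unrecognized code " + code;
    }

    public static void main(String[] args)
    {
        // Hypothetical message, shaped like the ones Active Directory returns
        String sample = "[LDAP: error code 49 - 80090308: LdapErr: DSID-0C09030B, "
                + "comment: AcceptSecurityContext error, data 52e, v893]";
        System.out.println(describe(sample)); // prints: invalid credentials
    }
}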
It is common that after a migration to Java, especially coming from legacy platforms like LINC or COBOL, our clients want to take advantage of new technologies. So it happens that they now authenticate against Active Directory or another LDAP server. And thanks to the new platforms, it is really easy for us to help them integrate this new functionality.
This is a sample program that shows how to authenticate against, for example, a Windows Active Directory.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.NamingException;
import javax.naming.directory.Attributes;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import javax.naming.ldap.InitialLdapContext;
import javax.naming.ldap.LdapContext;

public class LDAPTest
{
    static class LDAP
    {
        static String ATTRIBUTE_FOR_USER = "sAMAccountName";

        public Attributes authenticateUser(String username, String password, String _domain, String host, String dn)
        {
            String returnedAtts[] = { "sn", "givenName", "mail" };
            String searchFilter = "(&(objectClass=user)(" + ATTRIBUTE_FOR_USER + "=" + username + "))";

            // Create the search controls
            SearchControls searchCtls = new SearchControls();
            searchCtls.setReturningAttributes(returnedAtts);

            // Specify the search scope
            searchCtls.setSearchScope(SearchControls.SUBTREE_SCOPE);

            String searchBase = dn;
            Hashtable environment = new Hashtable();
            environment.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");

            // Using the standard port; check your installation
            environment.put(Context.PROVIDER_URL, "ldap://" + host + ":389");
            environment.put(Context.SECURITY_AUTHENTICATION, "simple");
            // Note: with simple authentication, an empty password can result in an anonymous
            // bind instead of a failure, so validate the input before calling this method.
            environment.put(Context.SECURITY_PRINCIPAL, username + "@" + _domain);
            environment.put(Context.SECURITY_CREDENTIALS, password);

            LdapContext ctxGC = null;
            try
            {
                ctxGC = new InitialLdapContext(environment, null);

                // Search for objects in the GC using the filter
                NamingEnumeration answer = ctxGC.search(searchBase, searchFilter, searchCtls);
                while (answer.hasMoreElements())
                {
                    SearchResult sr = (SearchResult) answer.next();
                    Attributes attrs = sr.getAttributes();
                    if (attrs != null)
                    {
                        return attrs;
                    }
                }
            }
            catch (NamingException e)
            {
                System.out.println("Just reporting the error");
                e.printStackTrace();
            }
            finally
            {
                // Close the context so the LDAP connection is not leaked
                if (ctxGC != null)
                {
                    try { ctxGC.close(); } catch (NamingException ignored) { }
                }
            }
            return null;
        }
    }

    public static void main(String[] args) throws Exception
    {
        InputStreamReader converter = new InputStreamReader(System.in);
        BufferedReader in = new BufferedReader(converter);

        System.out.println("Please type username:");
        String username = in.readLine();
        System.out.println("Please type password:");
        String password = in.readLine();

        LDAP ldap = new LDAP();

        // In authenticateUser you specify the attributes that you want returned.
        // Some companies use standard attributes like 'description' to hold an employee ID,
        // and the Active Directory schema can be extended with custom attributes (a printer
        // name, for example). Some installations have several Active Directory servers, say
        // 192.150.0.8, 192.150.0.7 and 192.150.0.9, and use DNS round robin to balance the load.
        Attributes att = ldap.authenticateUser(username, password, "mydomain.com", "myactivedirectoryhost.com", "DC=mydomain,DC=com");
        if (att == null)
        {
            System.out.println("Sorry, your user is invalid or the password is incorrect");
        }
        else
        {
            String s = att.get("givenName").toString();
            System.out.println("GIVEN NAME=" + s);
        }
    }
}
Just wanted to share a nice
tool I found to create quick links for blog posts. Developed by Laurence Gonzalves using the Google AJAX Search API, it is perfect for those constant bloggers who make extensive use of hyperlink tags.
Today I finally got a chance to play with Beta 2 of the System Center Virtual Machine Manager. Here are my preliminary impressions:
- First of all, the product has improved significantly. It has tons of new features, and tons of new prerequisites to go with them. Fortunately, you can download a pre-configured VHD with SCVMM, ready to go, from Microsoft Connect.
- This version of SCVMM uses the new Windows Remote Management (WinRM) package to manage remote servers. This is a step in the right direction, IMHO, since it is Microsoft’s implementation of the WS-Management protocol. The downside is that I had to install the WinRM package on the servers, but well, it’s a nice trade-off for getting a SOAP-based, standards-based management product.
- One thing I really like about it is the PowerShell integration. It is finally included in this version of SCVMM, and the implementation rocks – at the end of most wizards, you get a button that says “View Powershell Script”:

Pressing that button shows you the Powershell script equivalent to the options you selected:

So far I’ve only been able to add hosts and manage the virtual machines on those hosts. I am looking forward to working with the new advanced features, including (and especially) the physical to virtual migration. Overall I think the new features are great, and you should give it a try.