Tuesday 13 December 2011

Microsoft IT Camp

I was invited to attend a Microsoft test event on Monday 12th December. The TechNet staff were trialling a new format of training session and wanted feedback from people within IT on the format, and on how well it would work if rolled out as part of Microsoft's normal training material. The session was held at Cardinal Place in London; a great venue, very modern with superb facilities, but as I'm based down in the South West, it was a long way to travel.

The event was opened by a rather hoarse Simon May, who left a lot of the talking to Andrew Fryer. The basic idea was to showcase the updated versions of the System Center products, with a specific emphasis on virtualisation using Hyper-V, and a particular focus on setting up clusters. I'd seen some material on the earlier versions of these products, but was keen to see the 2012 versions due out next year.

For those that don't already know, there has been a move towards much tighter integration of the various products within the System Center range. Each product is now seen as an integral part of the overall suite, rather than as a separate product that just happens to work with the others. This seems to be a sensible move, and it means that sysadmins should have access to all of the tools they need to manage their data centres.

Rather than use high specification equipment, Andrew wanted to demonstrate that it was possible to set up a test lab using older machines; the sort that can be picked up on eBay or that might be sold off after an equipment refresh. He had several laptops: two acting as the Hyper-V hosts and one acting as a makeshift SAN unit. He proposed to join the two Hyper-V hosts together as the nodes of a single cluster.

The presentation did not go quite as planned! He ran into several key issues during the set-up, but as many of the people present were very familiar with the product, they were able to highlight a number of the factors that had caused the hiccups. What was interesting was that even with these technical issues, the whole process didn’t actually take that long.

During the day, and again at the end, the staff asked for feedback on the event, which, it has to be said, was generally positive. However, quite a few people (myself included) felt that they had missed a trick; many of us had our laptops with us, and it would have been a really impressive feat to have got these working as part of the set-up as well. There was a general feeling that most delegates would have been more than willing to bring their own equipment, possibly even downloading and installing some items in advance to make this more effective.

Having said that, the organisers were more than willing to consider this and a couple of other ideas that might allow those present to take a more active role. I’ve seen a couple of VDI infrastructure plans, and I feel that they could easily set up something that attendees could connect to and use to work with VMs, giving them a real “hands on” experience.

The plan is for the new format to be modified, based partly on experience but also on the feedback from those who were there. They also hope to develop it further to encompass more topics, and the organisers were keen to hear which ones were of the most interest. Some comments were made about making sure that any future events are held in other locations; the Microsoft offices are great, but not everyone can get there easily. Although there were no commitments, it seems that they intend to try to cover more of the major population centres than before; and that can’t be a bad thing!

I have to be honest, I do enjoy these sorts of events. I feel quite strongly that those of us who work in IT can all too easily develop a “silo mentality”. We get so wound up with day-to-day problems, and so often work in small groups, that it’s far too easy to forget about the bigger picture. That can also make the job less enjoyable; the passion for the work just drifts away. By going along to these sessions, it’s possible to see new ways of working that might otherwise pass us by, to meet other professionals and hear what problems they face. I find that it can help regenerate an enthusiasm for the work that is all too easily lost when you are dealing with very basic problems most of the time.

All in all, I found it to be an interesting, useful, enjoyable day. I suspect that future events will be along the same lines, but will benefit from the comments of those that have taken part so far. If you see one in your area, I would urge you to go along; it will most definitely be worth the time and effort.

Monday 5 December 2011

Jeux Sans Frontières

Back in the 1970s and 1980s, there was a TV programme called “It’s a Knockout”. This featured teams of people from across the UK competing in a series of increasingly silly games. The programmes were presented by the wonderful Eddie Waring and Stuart Hall; anyone who watched will remember the way Stuart used to collapse in fits of uncontrolled laughter at the various antics.

The format was so successful that it spawned an international contest, “Jeux Sans Frontières” (Games without Frontiers), with towns from across Western Europe taking part in, and hosting, these crazy contests. It was a lot of fun, and sometimes I wonder whether it wouldn’t be a good idea to resurrect the concept.

I mention this because it’s clear that a lot of companies in the SME market now have to deal with cross-border relationships; even quite small businesses like ours can sell to other countries thanks to the power of the Internet. In our case, we have offices in other countries, and our IT staff need to support users in those countries as well as in the UK.

This is not easy. I now have enormous respect for the support people in call centres that provide multi-language telephone support. Bearing in mind that my French was learned at school some 40 years ago, via the “Où est la plume de ma tante?” method of teaching, I was quite nervous about having to deal with potentially complex technical issues in another language.

Part of the problem is having the confidence to try to speak in another language, particularly if you don’t do this regularly. If you mutter something in an embarrassed way, and the other person then responds with an impatient “Quoi?”, then it’s easy to get nervous and that just makes things harder. However, it’s surprising just how much you can communicate with a relatively small vocabulary and if you speak confidently.

Try this: think of a phrase or sentence at least a couple of dozen words long. Now write out every third word on a piece of paper and give that to someone to read. The chances are that they will still understand what you mean from only the few words selected. Natural language is highly redundant (and not just English), so this works surprisingly often, even with complex phrases. It’s not necessary to get the grammar or syntax absolutely correct, as long as you use the appropriate words. We just had to learn the right phrases, and be able to use them appropriately.
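
If you’d rather not do the thinning by hand, a few lines of VBScript (the same scripting language I use for the signature script in an earlier post below) will do it for you; the sentence here is just a made-up example:

' Print every third word of a sentence
strText = "Please could you restart the main server in the office because the backup job appears to have stalled again overnight"
arrWords = Split(strText, " ")
strThinned = ""
For i = 0 To UBound(arrWords) Step 3
    strThinned = strThinned & arrWords(i) & " "
Next
WScript.Echo strThinned   ' run with: cscript //nologo thirdwords.vbs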

About a year ago, I bought an older server specifically to support a virtual platform. I then used the TechNet site to obtain copies of operating systems in the relevant languages we have to support. Although the configuration process and screens are the same, it’s helpful to know some of the differences in technical names; for example, in French, “Computer” is “Ordinateur”, but “My Computer” is “Poste de Travail”. Getting the correct phrase is not just a case of direct translation!

This has helped enormously, and I can confidently tell people on the phone to “Cliquez sur Démarrer”, “Aide et support”, “Assistance à distance” and so on. It has also allowed us to take screen shots of the various windows with the appropriate text in French, German and Hungarian, and these are used to create user documentation for inclusion in the FAQ section of our help desk software.

The end result is a better service for the end users. It makes them feel more confident in the support that we provide, and we have had some really good feedback from their staff. It also means that our support staff (i.e. me!) can feel a bit more comfortable when the dreaded “34” country code appears on the CLI of the incoming call.

Thursday 17 November 2011

Video Conferencing

This is a topic that crops up from time to time; and it’s one that I have some experience with.

A decade or so ago, people were selling video conference (VC) equipment for use with ISDN lines; these were OK, but there were technical issues with bandwidth and quality of service, and the user experience could be less than satisfactory. Pictures would be blocky or pixelated, and even audio could be a bit of an issue, especially with multi-way calling.

But the benefits to the business were really valuable, so people tolerated the poor quality. Even with only a couple of VC meetings every week, the cost savings were very significant to the company that I worked for. At that time (2000 – 2004), we calculated that we were saving around £25k to £35k per year, based wholly upon the petrol / mileage costs saved, with the sites about 200 miles apart; at the mileage rates of the day, each avoided 400-mile round trip was worth well over £100 per car, so a few avoided trips a week soon add up to that sort of figure.

When it became possible to use IP based systems, the quality of both audio and video improved considerably; the compression was better, the bandwidth higher and more consistent, and the user experience was such that people actually wanted to use the facility. I put this in at my current employer at all company sites, and I’ve estimated that we have saved around £450k to £500k over the last 6 years (for a capex of £25k and very little opex); that works out at roughly £75k to £83k a year, so the kit paid for itself within its first few months. This is based upon the petrol / mileage / flights, hotel accommodation and subsistence allowances that would otherwise have had to be paid.

This of course does not take into account the less tangible benefits: work / life balance (less travelling, fewer late nights), carbon footprint / environmental costs, and user interaction. We found that most staff collaborated better in VC meetings, and this generated some useful ideas which led to key improvements in many areas. It also helps staff (and even some managers) feel more engaged with the activities of the business.

It’s become so valuable that we are now seeing senior managers wanting access to a VC function on their desks. We provide this capability through units which look like PC monitors but can be switched to VC screens. We have experimented with smaller-scale products (Skype, OCS / Lync and others), but the managers do like the larger viewing screen, and it’s difficult to persuade them to use smaller viewing windows.

I think that, almost inevitably, we will be moving to telepresence at some stage; once they see the improved quality, I suspect they will be demanding it instantly. I’ve seen it in action and think that it is pretty awesome; if you haven’t had the chance, then call your regional supplier, as they will be delighted to demonstrate their offering. Our current equipment is still functioning well, but has more than paid for its installation, so replacing it would not be too much of an issue. The costs for the new hardware are a bit higher, but considering the cost savings, it would be well worth it.

Friday 4 November 2011

A-PDF Watermark Service is one of the best tools I have come across…

For some time now, we have had a bit of a technical challenge within our Technical Drawing office. These guys produce about 2,000 to 3,000 different engineering drawings a week, all of which have to be saved and then accessed by a large number of people within the factory as well as others throughout the different sites belonging to the business.

We have a Document Management System that allows us to link the drawings to various modules within our ERP software; this is part of a drive towards using less paper throughout the business. It works well, but only if the file is attached to the right item straight away; and often that isn’t possible, for a number of technical reasons.

The problem is that with that number of files, there is a key issue: how do you identify the right drawing and associate it with the right file? We have tried a number of different methods with file names and so on, but this doesn’t always help. Imagine that you have the printed drawing; it says that it is a left-handed swivel arm, but how do you know what file that drawing came from if you want a second copy?

After some discussion, we decided that what we needed was a simple tool to imprint a modified file name onto the drawings, including the works order number, quantity, and required date of the component. This would allow anyone looking at a drawing to identify exactly what file it came from, quickly locate the relevant file, and also know where to look within the ERP system.
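
To give a feel for what that means, here’s a quick sketch of the sort of composite name we had in mind; the exact convention shown is made up for illustration rather than being our real format:

' Build a drawing file name carrying works order, quantity and required date
Function BuildDrawingName(strWorksOrder, intQty, strReqDate, strPart)
    BuildDrawingName = strWorksOrder & "_QTY" & intQty & "_" & strReqDate & "_" & strPart & ".pdf"
End Function

WScript.Echo BuildDrawingName("WO123456", 10, "2011-11-04", "LH-SwivelArm")
' Gives: WO123456_QTY10_2011-11-04_LH-SwivelArm.pdf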

After some considerable research, we found A-PDF Watermark Service from A-PDF. This useful little tool stamps those file name details onto a designated place on each drawing; and it does so automatically.

http://www.a-pdf.com/watermark-service/download.htm

Using this product saves the time spent hand-writing (or typing) the information onto the drawings, and it also removes the element of human error. It’s installed on the relevant file server and runs as a background service that processes the files automatically, and it seems to handle the workload that we are throwing at it with ease.

We highly recommend this product as a simple but effective solution.

Monday 27 June 2011

Office365 (part 2)

After my last post about Office365, I thought that I would write a bit more about why I think it would be such a good product for us; the rationale behind the thinking.

Some 10 years ago, less than half of the office staff had PCs, and there were perhaps 2 PCs in the factory area. Now, everyone in the offices has a PC (some have more than one) and in the factory areas, there are just over 2 PCs for every 5 staff. (These are shared by people and used as required to access relevant data.) As you can see, there has been a significant growth in the use of IT systems in the last decade.

About 6 years ago, some people started working with laptops, using VPN connections to get access to systems in the office, primarily for email when they were off site. To start with these were senior managers, IT staff and some sales people, but over the last couple of years the number has increased to include many others. We even have a couple of ladies from our customer support team who regularly go out to visit partner companies, and they take a “pool” laptop with them.

As you’ll realise, having access to email, CRM and ERP systems, along with data files, is pretty important for many of these staff and helps them do their jobs far more efficiently. However, although the process to connect the VPN is really easy, some of them still occasionally have difficulties making the connection, and we have been looking to see if there is a way to make their lives easier.

One thing that was discussed in the Microsoft “Jump Start” sessions a few weeks ago was the concept of a “hybrid” cloud: one that links public and private cloud options together. In the session, there was a discussion about linking Office365, using LDAP, to an existing Exchange Server inside a company’s LAN. Effectively, this would extend the mail function so that designated Active Directory users outside the network use Office365, while staff inside use the normal Exchange Server; the two would be linked together, effectively as a single system, and without the need for VPN connections.

I think that this could be a major benefit for us; it would make life easier for all staff that travel, as they would have access to their email without having to worry about running VPN connections. They could use their laptops, their smartphones, tablets or even a PC belonging to the people they are visiting to get access to their mail and other material.

As for staff inside the business, they would continue to use the existing Exchange mailboxes, but they would still see the travelling staff as being on the same system. It might even be an option for some internal staff to use a tablet while moving around the factory; although I’m not sure that these devices are quite robust enough for some of the heavy handed individuals we employ!

Of course, there are security issues, but that is a discussion for another time. I feel that the hybrid option would make a lot of sense for us; it would provide a sensible and elegant solution to a problem that has caused a few issues and will only get more serious as time goes by. I think that Office365 deserves some serious consideration; it could provide a real option for our travelling staff, and it might be a real advantage to the business.

Monday 13 June 2011

Office365

As promised in my last post, I’m going to write about the new Office365 product for which I have been testing the beta version. If you want to take a look at the beta for yourselves, then sign up here:

http://www.microsoft.com/en-gb/office365/enterprise/hosted-software.aspx?CR_CC=200038628&WT.srch=1&CR_SCC=200038628

Essentially, the concept is simple; this is an online product that provides the functionality of the normal Microsoft Office package. It runs in a browser window, and the key thing is that it can be accessed from any device at any time. All you need is a standard Windows Live ID to get access to the relevant portal.

The front page is quite straightforward, very “clean” and uncluttered; it gives a brief overview of what tasks need to be done and how to access the key components. There are also links to support, the community forum, and information on how to perform key tasks.

The Outlook function is accessed from a menu item and is based upon Outlook 2010; even if you are using an earlier version, you will probably be able to work out how to do things. I tried this on an iPhone, where there is a slight difference in appearance as it uses the Outlook Web App (OWA). For advanced Outlook users, there are a couple of functions missing; the public folders option is one. However, I found it really easy to use, and I suspect that most people would have no trouble switching to this from an existing version of Outlook.

There are also calendar, contact list and task list functions. We use these in our normal Outlook set-up, so they might be something that we could use to good effect. For the contacts, we would need to find a way to separate out some of the entries, as otherwise we would end up with massively long lists, making it harder for people to find what they need.

The Office365 product includes SharePoint Online, which is exactly what it sounds like. It seems to be based upon the SharePoint Foundation product and offers the same kind of functionality. Although I had a few issues with the provisioning at first, an email to the Support Centre fixed that, and I then found it really simple to set up and use.

I’m actually a great believer in SharePoint; I think that it has a lot of functionality that would help fix a lot of business issues and provide a mechanism for resolving several key communication problems. The only downside is that it sometimes seems very difficult to get the users to understand that they can take control of many of their tasks; they seem to have a very fixed view that only IT staff can do these things.

Office365 also offers Lync, the new instant messaging client; I thought it looked very slick and had a number of very useful additions compared to earlier products. Again, this is something that I think we don’t make enough use of; following a couple of tests, there are some key users that really like the product, but unfortunately there are many more that simply do not want to even try it.

Lync can also be used for audio or video conferencing; I did one very quick test and it worked well, but that was only between 2 users within our network. It would have been useful to test it against a couple more users for a slightly larger conference call; we may still do that another time as we still have over 140 days left on our 6 month beta licence.

The other main features are the Web Apps for Word, Excel, PowerPoint and OneNote; these are very similar to the 2010 versions of the software, and most people will pick them up very quickly. I’m not a great fan of the ribbon interface, but I suppose that I’ve become used to it; the Web Apps use the same feature, so it makes sense to get used to it now.

There are a number of arguments about the use of cloud computing; that’s going to be a topic for another day. Suffice it to say that having tested Office 365, I really like it and most other users seem to find it very straight forward. We don’t know the price yet, but I have seen a couple of suggestions for the cost, and I think that it could be very affordable.

Office 365 is a really good product even though it is just the beta version so far; it’s one that I’ll be keeping an eye on over the next few months for sure.

Tuesday 31 May 2011

Office 365 Jump Start sessions

Last week, I had the opportunity to take part in 3 training events organised around the new Microsoft Office365 product, the replacement for BPOS. These sessions were all online, run using MS Live Meeting, with a mixture of PowerPoint slides and some actual demos of the product in use.

The sessions started at 10.00 am Pacific Daylight Time (18.00 BST), as they were being hosted from the West Coast of the USA. They ran until 4.00 pm PDT, which meant staying up until midnight; very much a long evening, particularly as I usually get up at 6.30 in the morning. However, the event was so worthwhile that I don’t feel too put out by that.

(http://blogs.technet.com/b/uktechnet/archive/2011/05/11/register-now-for-the-office-365-jump-start-for-it-pros.aspx)

On the first day, they had a few technical issues with the audio at the very beginning of the session; for some reason, they kept losing the sound from the presenters. However, once that little hiccup was out of the way, the sessions picked up pace quite rapidly and they went through a great deal of information.

The moderator was Adam Carter, who kept things moving along really nicely; he was joined by a number of people with specific knowledge of key components of the package, who went into the various parts in some detail. At the same time, the online participants were invited to ask questions; some really great issues were raised, and for the most part the moderators were able to deal with these or pass them on to the specialists to elaborate further.

I’ll write up a bit more about the actual product itself in a later blog post; suffice to say that the various components were explained and demonstrated very well. I would suggest most people had a really good opportunity to see them in action, learn a bit more about some of the basic administrative tasks required, and how to make use of the new product.

Of particular note was the session on using PowerShell for some of the admin tasks; for those that are not so confident in using this utility, or are still working out if they need it, the demonstration showed just how flexible and easy to use it is, and I’m sure that many will have gone away determined to learn more about working with the cmdlets.

For me, the best demonstration was by Mark Kashman, who gave a superb presentation on the use of SharePoint Online. He had created a demonstration site using the “Fabrikam” company name, and it was quite astonishing; simply one of the best SharePoint sites I’ve seen. A number of people asked if it would be made publicly accessible as a reference site; he said they will look at this, but felt that the site was still unfinished and that the team would want to do more work on it before releasing it into the wild.

All in all, this was a really great opportunity to learn more about the new Office365 product. It was very well put together and, I think, pitched at just the right level for most of the people involved. The slides are now available online to download –

http://borntolearn.mslearn.net/office365/m/officecrapril/default.aspx

They did suggest that the videos will also be available in a couple of weeks’ time, and if I get the details, I’ll add them on as well. They also promoted the Microsoft Virtual Academy, another really great free resource; if you haven’t heard about this, check it out at

http://www.microsoftvirtualacademy.com/Home.aspx

I hope that Microsoft put out a few more sessions like this; if they do, you would be well advised to sign up, as it is a great training resource for IT sysadmins and a good way to make sure that they stay on top of the latest products and developments.

Monday 25 April 2011

DPM across Domains

I've been using the Microsoft System Center Data Protection Manager (DPM) product for just under 4 years, and I really like it. As far as I am concerned, it ticks all of the relevant boxes: easy to install, easy to use and manage, and most importantly, it works really well. It backs up to disk, then from disk to tape; it uses a relatively small amount of bandwidth, and data recovery is quick and easy. It is simply one of the best backup products that I have come across, and far easier to use than many of the better known packages.

A while ago, the company bought out a partner organisation. This left us with a sales office based in Paris; they are a separate entity, but as they are quite small, they don't have their own IT staff. They had been using the services of another business, but it was decided a while ago that we would take on that responsibility. We needed to provide a backup function to preserve their data, and set about putting this into place.

One of the key issues was that they did not have an Active Directory domain on site. Everything was set up as a workgroup only, and this caused a lot of issues, so one of the first things to do was to set up a suitable domain structure. Hopefully, this will reduce the amount of admin work required; previously, it was necessary to create a local user account on every single piece of kit, which meant a lot of work. The new domain was created a couple of weeks ago, and we've now also created a two-way trust between the two AD domains.
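
For anyone doing the same, the trust itself can be created from the command line with the netdom utility; the domain names and accounts below are placeholders rather than our real ones:

netdom trust paris.example.local /d:uk.example.local /add /twoway /UserD:uk\administrator /PasswordD:* /UserO:paris.example.local\administrator /PasswordO:*

(The /PasswordD:* and /PasswordO:* switches make netdom prompt for the passwords rather than leaving them in the command history.)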

The next step was to set up the remote site to be backed up by our DPM server, but this was where we hit a snag. Each time we tried to install the agent, it responded with messages saying that the remote site was not available. I could prove that this was false; I could ping the remote server and even RDP to it from the DPM server. I checked all sorts of things, and everything showed that the remote site was fully operational and accessible.

So I decided to do a manual install of the agent on the remote site. The first step was to RDP to the remote server, then create a mapped drive back to the DPM server. Having done that, I opened the folder where the DPMAgentInstaller.exe file is found - that's at \Program Files\Microsoft DPM\DPM\Agents\RA\<version>\i386 on the DPM server (there are also folders for the AMD / 64-bit installers).

This actually went through OK; having installed the agent, it's necessary to tell it which is the correct DPM server. This is done using \Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe -dpmServerName <DPM server name>. Again this went through OK, but it still produced an error message saying that there were insufficient permissions to complete the process.
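
Pulling those steps together, the manual install boils down to something like the following, run on the remote server; the server name DPMSRV01 is a placeholder, and I'm assuming the mapped drive points at the DPM server's admin share:

net use X: "\\DPMSRV01\c$\Program Files\Microsoft DPM\DPM\Agents\RA\<version>\i386"
X:\DPMAgentInstaller.exe
"C:\Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe" -dpmServerName DPMSRV01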

After checking the event log, I could see a number of LsaSrv Event ID 6033 errors. These indicated that I needed to modify a registry setting on the remote server to lift the block on anonymous logons. Having done this, it then showed another set of errors that indicated there was still a problem with permissions.

Having checked these yet again, I could see that the DPM server was in the correct groups etc., but I also thought to put the DPM administrator account into the local Administrators group. Having done this, the error went away, but the agent still wouldn't connect to the DPM server. However, I ran the SetDpmServer.exe utility again, and this time it completed correctly. When I went back to the DPM console, it showed the agent as installed and connecting to the remote server.

So now we are in a position where we can actually back up that remote site. It will be a bit of an issue to begin with, as there is a lot of data on site. I'll probably go over again to do a manual copy of the data to a portable hard drive. This can then be copied manually to the DPM server to provide the initial data load, after which the synchronisation process only has to deal with the data that has changed since that copy; a great deal less than a full synch across the WAN.

This is going to make a huge difference to the people on the remote site; they won't have to worry about tapes, or what to do if someone goes on holiday. The data is backed up off site, so it is more secure. The recovery process is really simple, and we can give them the confidence that we can deal with problems really quickly if needed.

Wednesday 2 March 2011

Transformational Security

A couple of weeks ago, I attended an event hosted by Computer Weekly, SC Magazine and a couple of others: “Information Security Leaders 2011: Transformational Security”. As you might gather from the title, it was a look at how and why things are changing, and how to provide security in the newer IT landscapes.

Although a lot of people think that these are just junkets, with a chance to pick up some swag and eat and drink at someone else's expense, I actually find these events very useful. Working within IT can have its problems; all too often, we work in small groups, and it's very easy to become isolated. This means that we develop set habits, and forget that there may be other ways of doing things.

Getting out to events like this can be really useful in many ways. It's interesting to talk to others in the industry and see just what kinds of problems they are facing. All too often, we might think that we are the only ones with a particular issue, only to find many other people with exactly the same problem. I really like to share advice and information on how we approach some of these and how and why we go down the route that we do.

This particular event was very useful. There were some keynote speakers who offered real insight into just how things are changing and why, and they offered some considered advice on how to treat this as an opportunity. In particular, the concept of "consumerisation" was raised: people using their own equipment for home email, social networks and so on, then wanting to use the same items for work. (That's not just the same make or model, but the actual device.)

At first, I thought that this was not an issue that we would face; but then I realised that it has already happened. We have a number of staff with their own smartphones who are trying to connect up so that they can get their email on the device. It's not been a major issue so far; but what would we do if one of those people left the company? (OK, cancelling their email account is a start, but what if they had access to someone else's account as well?)

And how would you react if they lost their mobile device, and someone else found it and could use it to get access to company systems? The answers may seem simple, but as the speakers pointed out, this is the thin end of the wedge, and it's going to start happening a lot more often, involving a lot more devices and people.

All in all, the event was a good day (and yes, the food was good!); it was also very useful from the point of view of getting people to think slightly outside of their comfort zone. If there are any more events of this type, either this year or in the future, I would strongly recommend taking the opportunity to get along. You won't regret it!

Wednesday 2 February 2011

Email signatures

Some time ago, it was suggested that we should have an agreed format for email signatures across the company. Unfortunately, it took some time to get agreement on what format to use. I could go into the details, but it's pretty boring; the discussions on the font to be used, for example, seemed to take forever. Suffice to say that there were numerous discussions, and it took quite a while to reach a final decision.

There are numerous sample VB scripts out on the Internet for producing an email signature, but none seemed to achieve what we wanted. I did think about trying PowerShell, but I don't yet know enough to do the work that way. As I've used VBScript on and off for a few years, it made sense to use that, at least for the time being.

The script has taken a little while to put together to make sure that it meets the needs of the business. It takes data from the Active Directory, formats it and places it in the required location. It also inserts a company logo, and there is a bit of conditional text to insert other logos; this is because we attend a number of trade shows, and like to promote these on our emails.

There is one slight issue; the email has to go out in Rich Text Format. If it goes as HTML, the lines get double spaced; this is just down to the way it gets rendered, and I haven't found a way around it. Also, if it goes out as plain text, the logo doesn't get inserted. The script works through Word: it extracts the AD data and builds the signature in a Word document before saving it into Outlook.

I'm putting the script below as I am quite pleased with it and the results; if it would be of any help, please feel free to make use of it. Just copy the text, paste it into a text file, save it, and then change the extension to .vbs. I haven't tested it with every version of the software, but I have tried it with Outlook 2003 / 2007 / 2010 against Exchange 2003 (on Server 2003), on PCs running Windows XP and Windows 7, and it worked in each case.

(Note that I have removed the specific details of our company so that it is a generic script; you would then have to modify it to show your own details.)

Enjoy!

====================

' Build an Outlook email signature from the user's Active Directory details.
' Note: On Error Resume Next hides all runtime errors - comment it out while testing.
On Error Resume Next

' Bind to the logged-on user's account in AD
Set objSysInfo = CreateObject("ADSystemInfo")

strUser = objSysInfo.UserName
Set objUser = GetObject("LDAP://" & strUser)

' Pull the user's details from AD; the addresses, web address and logo paths are hard-coded
strName = objUser.FullName
strTitle = objUser.Title
strDepartment = objUser.Department
strCompany = objUser.Company
strOffice = objUser.physicalDeliveryOfficeName
strPhone = objUser.telephoneNumber
strFax = objUser.faxNumber
strMob = objUser.Mobile
strAddrs1 = "Site 1 Address"
strAddrs2 = "Site 2 Address"
strAddrs3 = "Site 3 Address"
strWeb = "www.domain.net"
Logo = "\\server\share\logo.jpg"
ShowLogo = "\\server\share\show1.jpg"

' Create a Word instance and get at its email signature objects
Set objWord = CreateObject("Word.Application")

Set objDoc = objWord.Documents.Add()
Set objSelection = objWord.Selection

Set objEmailOptions = objWord.EmailOptions
Set objSignatureObject = objEmailOptions.EmailSignature

Set objSignatureEntries = objSignatureObject.EmailSignatureEntries

' Set the font, insert the company logo, then leave a blank line
objSelection.Font.Name = "Arial"
objSelection.Font.Size = "10"

objSelection.InlineShapes.AddPicture(Logo)
objSelection.TypeParagraph()
objSelection.TypeParagraph()

' Name, title and office details, one per line
objSelection.TypeText strName & ", " & strTitle & Chr(10)
objSelection.TypeText strDepartment & ", " & strCompany & ", " & strOffice & Chr(10)

' Pick the postal address that matches the user's AD office name
if strOffice = "Site1" then
objSelection.TypeText strAddrs1
end if
if strOffice = "Site2" then
objSelection.TypeText strAddrs2
end if
if strOffice = "Site3" then
objSelection.TypeText strAddrs3
end if
objSelection.TypeText Chr(10)   ' end the address line (the original referenced strOffAddrs here, a variable that was never set)
objSelection.TypeText "Tel:" & " " & strPhone & Chr(10)
objSelection.TypeText "Fax:" & " " & strFax & Chr(10)

' Only show a mobile number if AD has one
if strMob <> "" then
objSelection.TypeText "Mob:" & " " & strMob
end if

objSelection.TypeParagraph()
objSelection.TypeText strWeb & Chr(13)
objSelection.TypeParagraph()
objSelection.TypeParagraph()

if strOffice = "Site1" then

end if
if strOffice = "Site2" then
objSelection.InlineShapes.AddPicture(ShowLogo)
end if
if strOffice = "Site3" then

end if


' Register the finished document as the signature for new messages
Set objSelection = objDoc.Range()

objSignatureEntries.Add "AD Signature", objSelection
objSignatureObject.NewMessageSignature = "AD Signature"

' Mark the document as saved so Word quits without prompting
objDoc.Saved = True
objWord.Quit
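
====================

To test it, run the script manually from a command prompt with cscript //nologo signature.vbs (or whatever file name you chose), then open a new message in Outlook to see the result. For a wider rollout, a logon script is the obvious route, but I'll leave the deployment details to you.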

Tuesday 4 January 2011

A third slice of V

Happy new year to one and all!

This post follows on from a previous item on virtualisation. We had installed the hardware, then the ESXi software - now to start getting serious.

ESXi does have a console for setting up certain key items, but it is very limited. Essentially, it allows you to change the hostname, set IP addressing and some security options, and not much else. To manage the host machines, you have to use another piece of software, the vSphere Client, which runs from a PC. I already had a copy of this installed on my laptop from the tests that I had run earlier in the year; however, I decided to get the latest version so that we could start as we mean to continue.

The update went through quite quickly, and after about 15 minutes I had the logon dialog box. Put in the correct IP address and log on to the host; except that it came up with an "invalid user name or password" message. I checked the details and they were correct. I double checked them: domain, username, password. They were definitely all correct. After staring at this for a few minutes, I realised that the host installation had used a US keyboard layout and I was inputting the details using a UK layout keyboard. When I re-entered the same details using the US layout, it let me access the host. It appears that there is no UK layout option in the host installation routine.

Looking at the details of the host, I could create VMs and allocate resources, but this wouldn't allow me to manage the other hosts. To do that, I had to install the vCenter Server product and use that for all of the management. The idea was that this would be installed on the first VM, but when I tried to install the software, it produced an error stating that it was not possible to install the server software on a VM. This made no sense; the material that I had received all indicated that best practice would be to install vCenter Server on a VM.

After some analysis, the solution became obvious; I had the wrong version of vCenter Server. I had downloaded it from the VMware web site; once you get used to the site it is quite sensibly laid out, but to start with it can be a bit overwhelming. When I checked, there was a particular set of downloads to match the version of VMware that we had purchased, and that was where I should have got the software from. So I downloaded that version; and yes, it installed straight away.

So far, so good; I had the hosts running, the SAN was available, and with the vCenter Server software installed, I could see all of the hosts and start to do some more detailed work. Unfortunately, we have a number of projects on the go at the moment, so I was involved in another one for a few days before I could get back to playing with the VMs.

When I did get back to the virtual platform, I wanted to make the storage on the SAN unit available. I was able to set up the iSCSI initiators, and these showed the disk allocation on the SAN unit. However, the storage was not available to the VMs; there is a check box that needs to be ticked for that to happen.

Later on, I realised that we still had an issue; although the storage area was available to VMs on the one host, it wasn't available to the others. Further checking revealed yet another setting (this time on the SAN itself) that needed to be changed, and as soon as this was done, each of the hosts could see all of the storage areas.

Unfortunately, I only got this resolved after I had created the first VM and installed vCenter Server. This means that the image and the virtual disk are actually stored on a local drive on the host server, which is not quite what was planned. It appears that this can't be moved using the VMotion process, but I may be able to get around it by using the P2V function at a later stage. If this works, I'll write another piece about it later.

So at this point we had all of the hardware installed, all of the software licensed and running, our first VM created, and some templates ready for future use. We can now manage the systems, and have experimented with copying, snapshotting, moving VMs using the VMotion process, modifying resource allocation and deleting the various unwanted bits. It has taken a bit of time, but there is now a good level of confidence in the product, and we are comfortable that we can move to the next level. More on that next time.