Upgrading SharePoint 2003 to 2007 - Issues


There are a lot of different scenarios, issues and methodologies related to upgrading from SharePoint 2003 to 2007 (WSS or SPS).


  1. Check the service pack level of SPS 2003 against the prerequisites;
  2. Back up the existing SharePoint databases;
  3. Launch prescan /all on the server to be upgraded, to check that SharePoint is ready for the upgrade. You can find this utility on a 2007 installation under the <%root%>\program files\common files\Microsoft Shared\web server extensions\12\bin directory; copy it to the 2003 installation (so you can execute it there);
  4. Manually fix all the errors found by prescan;
  5. Proceed with the upgrade (according to Microsoft best practices).

Every time you will find different errors, depending on the SharePoint topology and configuration.

This morning I found the following errors while migrating WSS 2.0 to WSS 3.0:

  • Launching prescan /all, the log file contained:


12/11/2008 10:35:47 Scanning SPWeb: http://customer-server/test
12/11/2008 10:35:47   Updating list schema in web.
12/11/2008 10:35:47 Scanning SPWeb:
12/11/2008 10:35:47   Unghosted page: http://customer-server/xyz/default.aspx.
12/11/2008 10:35:47   Unghosted page:
12/11/2008 10:35:47   Unghosted page:
http://customer-server/xyz/Lists/Available Areas/AllItems.aspx.
12/11/2008 10:35:47   Updating list schema in web.
12/11/2008 10:35:47 Scanning SPWeb: http://customer-server/zz
12/11/2008 10:35:47   Updating list schema in web.
12/11/2008 10:35:47 Checking if Server="customer-server";Database="customer-content-database";Uid="userid";Pwd="userpwd";App="prescan.exe" is a WSS V2 SP2 database.
12/11/2008 10:35:47 Checking if any site has not yet been scanned in Server="customer-server";Database="customer-content-database";Uid="userid";Pwd="userpwd";App="prescan.exe".
12/11/2008 10:35:47 Error: The following site has not been scanned. Id = ef24fe63-efdf-42b7-b431-a9ae04abc24a and Url = /
12/11/2008 10:35:47 Checking if any list has not yet been scrubbed in Server="customer-server";Database="customer-content-database";Uid="userid";Pwd="userpwd";App="prescan.exe".
12/11/2008 10:35:47 Error: The following list has not been scrubbed: Id = b5f896c9-8f1c-40d2-9282-1154ccbaf6cb, Name=Web part galleries, Containing Web=
12/11/2008 10:35:47 Error: Prescan has encountered sites or lists that were not updated because they cannot be accessed using the SharePoint Products and Technologies object model. The most likely reasons for Prescan to skip a list are covered in the Knowledge Base article at:
12/11/2008 10:35:47 Skipping virtual server:
http://customer-server:81/. Server state = NeedExtend. Most likely this virtual server is not extended with WSS v2.
12/11/2008 10:35:47 Scan finished with failure.
12/11/2008 10:35:47 ===============================Logs===============================
12/11/2008 10:35:47 Log file: C:\DOCUME~1\sts_test\LOCALS~1\Temp\2\PreupgradeReport_633645885183825054_Log.txt
12/11/2008 10:35:47 Summary file: C:\DOCUME~1\sts_test\LOCALS~1\Temp\2\PreupgradeReport_633645885183825054_Summary.xml
12/11/2008 10:35:47 ==============================Totals==============================
12/11/2008 10:35:47 Number of sites skipped (already scanned):   0
12/11/2008 10:35:47 Number of sites scanned:   2
12/11/2008 10:35:47 Number of broken sites:   0
12/11/2008 10:35:47 Number of webs scanned:   554
12/11/2008 10:35:47 Number of broken webs:   0
12/11/2008 10:35:47 Number of webs using custom template:   0
12/11/2008 10:35:47 Number of pages scanned:   5848
12/11/2008 10:35:47 Number of unghosted pages:   1606 (27,46% of total).


Usually the first thing to do with this kind of problem is to try to fix it with:

STSADM -o databaserepair -url http://customer-server -databasename customer-content-database

This command finds and fixes (removes) all the orphaned entities in the SharePoint content database.

The problem is that SOMETIMES stsadm is unable to find the orphans, so prescan doesn't complete successfully and you cannot upgrade...

To fix that you can MANUALLY force the orphans in the content database by working directly with the SharePoint database tables.

The idea is to get the entity IDs from the prescan log file (in our example the site with Id=ef24fe63-efdf-42b7-b431-a9ae04abc24a and the list with Id=b5f896c9-8f1c-40d2-9282-1154ccbaf6cb).

To force the orphans, connect to the SharePoint content database and execute the following SQL commands:

use [customer-content-database]

-- check what is going to be removed first
select * from Lists where tp_Id = 'b5f896c9-8f1c-40d2-9282-1154ccbaf6cb'
select * from Sites where Id = 'ef24fe63-efdf-42b7-b431-a9ae04abc24a'

delete from Lists where tp_Id = 'b5f896c9-8f1c-40d2-9282-1154ccbaf6cb'
delete from Sites where Id = 'ef24fe63-efdf-42b7-b431-a9ae04abc24a'

Now go back to the command line shell and execute:

STSADM -o databaserepair -url http://localhost -databasename customer-content-database -deletecorruption

After that, the orphans will be removed.

Now you can launch prescan /all and it should run successfully.

Keep in mind that you have to manually fix each error discovered by prescan, and each time you have to find a workaround! This SQL code must be adapted to your scenario, and you may have to deal with other SharePoint tables!


Microsoft Virtual Server and Remote Desktop

This is for my colleagues who get scared when they see "The website cannot display the page" with an Internal Server Error while trying to access the Virtual Server Web Management page from a Remote Desktop session:

It's an authentication problem! Virtual Server is running fine!

Use the console session from Remote Desktop and everything will be fine (or access the web page from another client instead of locally via Remote Desktop).

How do you start a remote console session from Vista or an up-to-date XP?

mstsc /admin (in the past it used to be mstsc /console)

and then enter the target server IP in the logon window.



REST and Industrial Applications - An alternative to OPC


Writing software for industrial applications is tricky and risky: a lot of different software platforms and hardware devices must be integrated in a common environment that has often evolved over tens of years.

In this scenario, one of the most important standards is OPC, which defines sets of specifications about how to produce and consume the data, alarms and events generally produced and processed in an industrial system.

The following picture is a sample scenario showing the concept behind OPC:


A sample scenario

At the bottom there are some sample classes of industrial devices:

  • PLC-A is a sample PLC which exposes a set of memory data (called tags).
    A tag is a calculated variable or the value of a physical signal. Usually tags are accessible using a proprietary protocol, like Siemens SH1, depending on the PLC used.
    In the sample the tags are named A01, A02, A0n for PLC-A (and Z01, Z02, Z0n for Legacy-Z).
  • PLC-B is another sample PLC, working in the same way as PLC-A but using the GE EGD protocol to expose tags to the other systems.
  • Legacy-Z is a sample system implementing a complex mathematical model and exposing data to the upper layer using a custom UDP protocol.
    This kind of system usually gets data from the PLCs, processes it with a feedback model and generates setup data packets that are sent back to the PLCs. A lot of heterogeneous operating systems and programming languages have been used to deliver these applications (real-time OSes, Unix, Fortran, PML, C, etc.).
    In this sample we assume that some of the calculated values are exposed with the tag pattern (name, value) using a custom-developed protocol.

Before OPC

As you can imagine, before the OPC era, developing the upper software layers (like databases with trend analysis, or HMI modules with the user interfaces) required creating 1:1 connections with each integrated device, implementing the custom protocols (SH1, EGD, TCP/UDP, etc.) in every client application and wasting a lot of effort on application plumbing.

For example, if your HMI needs to integrate PLC-A, PLC-B and Legacy-Z, you must have in your code the SDKs or components for using Sinec-H1, EGD and the custom UDP protocol. The same goes for the trend-analysis database.

What is OPC

Referring to the depicted scenario:

OPC is a standard communication protocol to mediate and expose the underlying protocols to the upper software layers through a single, standardized access model.

As you can see from the sample picture, the Trend server and HMI server are directly connected to the OPC Server using just a single protocol (the OPC protocol).

To implement an OPC server you should get one from the market. There are a lot of different products (Kepware, OPC Power Server, Matrikon, etc.) and you choose one based on the availability of the protocols you need.

In the sample scenario there is a logical mapping between the OPC exposed tag values and the underlying tags (the OPC tag 00-01 is mapped to the physical tag A01, the OPC tag FF-02 corresponds to the physical tag Z02, and so on)!

One OPC server integrates different heterogeneous devices using one common logical tag table.

The translation from the industrial protocols (Siemens SH1, EGD, etc.) is the job of the OPC drivers (there are a lot of different drivers on the market). If you need to translate a custom protocol (the custom TCP/UDP protocol in the sample scenario) you can write your own driver with an existing OPC SDK.

Architectural Pattern

From an architectural perspective you can consider OPC as a common layer that maps and provides access to a network of underlying devices and resources using a name/value addressing pattern.
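The name/value addressing pattern can be sketched in a few lines. This is a toy illustration in Python, not real OPC code; the class names are made up, and only the tag names (00-01 mapped to A01, FF-02 mapped to Z02) come from the sample scenario:

```python
# A minimal sketch of the OPC tag-mapping idea: one logical tag table
# in front of several device-specific readers. All names are hypothetical.

class DeviceDriver:
    """Stands in for a protocol driver (Siemens SH1, GE EGD, custom UDP)."""
    def __init__(self, values):
        self._values = values          # physical tag -> value

    def read(self, physical_tag):
        return self._values[physical_tag]

class OpcLikeServer:
    """Maps logical tags (e.g. '00-01') to (driver, physical tag) pairs."""
    def __init__(self):
        self._mapping = {}

    def map_tag(self, logical_tag, driver, physical_tag):
        self._mapping[logical_tag] = (driver, physical_tag)

    def read(self, logical_tag):
        driver, physical_tag = self._mapping[logical_tag]
        return driver.read(physical_tag)

# Wiring that mirrors the sample scenario: tag 00-01 -> PLC-A tag A01,
# tag FF-02 -> Legacy-Z tag Z02. Values are invented for the demo.
plc_a = DeviceDriver({"A01": 42.0, "A02": 7.5})
legacy_z = DeviceDriver({"Z01": 0.0, "Z02": 99.9})

server = OpcLikeServer()
server.map_tag("00-01", plc_a, "A01")
server.map_tag("FF-02", legacy_z, "Z02")

print(server.read("00-01"))   # -> 42.0
```

The clients only ever see the logical tag table; swapping a device means rewiring the mapping, not touching the clients.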

When you implement complex mathematical models or process control software in a modern environment, you'd like to leverage OPC to implement your common memory areas, providing access to your computed variables for different algorithms and models.

So in a complex mill there are different applications running on different hardware systems that need to share data in exactly the way OPC was built for.

The problem is that OPC isn't fast enough to let process control software do its job, and the code you have to write to provide OPC access is tedious and "fat" due to the involved SDKs and components.

So the question is:

How can we leverage the OPC pattern to implement common data areas using modern technologies, providing access to shared variables?

My idea is the development of a REST service.


Representational State Transfer (REST) is an architectural style to expose a set of connected resources (and their basic operations), usually leveraging the HTTP protocol suite.

It's a different thing from Web Services and SOA. I guess the best approach to understanding REST is an example:

  • A multimedia content application could expose its catalog through REST, providing basic services for updating it;
  • A complex system could expose its configuration and metadata, and client browsers could leverage REST to connect and update those data;

The pillars of REST are:

  • URIs to address connected resources (for example: http://mycontenctapplication/myCatalog/Author="Rocking Corrado");
  • HTTP verbs to specify the operation on the connected resource:
    • GET to fetch or read resource values;
    • PUT to update or insert resource values;
    • DELETE to delete resources;
    • POST to append resources;

Now the problem is: how can we leverage REST for industrial applications?

REST and Industrial Software

If your software has to manage a complex site (like a Hot-Strip Mill), you have to deal with different applications and process controls (Furnace Control, Roughing Mill, Finishing Mill, Cooling Section) that have to exchange a lot of data and messages.

Those applications usually use custom protocols to exchange messages (pushing data over 1:1 synchronous interfaces) and use common shared areas for in-process communication.

The following picture shows a Hot-Strip Mill process control system built using a REST architecture.


The REST application will be developed with the following features:

  • A data structure will be implemented to collect information from the existing process control software;
  • The data structure will be updated through standard interfaces with the existing software, for example TCP/IP sockets;
  • The data structure will be exposed with a RESTful interface;

If you use REST to expose the common area, I guess you will have a "closed" set of tags, so that only the GET and PUT verbs make sense (consumers of the REST service will never add or delete tags).
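To make the idea concrete, here is a minimal sketch (Python standard library only) of such a closed tag area exposed over HTTP with only GET (read) and PUT (update). The tag names, the values and the /tags/<name> URI scheme are all hypothetical, chosen just for the demo:

```python
# A closed set of tags exposed RESTfully: GET reads a tag, PUT updates it,
# and nothing can create or delete tags. All names here are hypothetical.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

TAGS = {"furnace.temperature": 1250.0, "finishing.speed": 9.8}

class TagHandler(BaseHTTPRequestHandler):
    def _tag_name(self):
        prefix = "/tags/"
        return self.path[len(prefix):] if self.path.startswith(prefix) else None

    def do_GET(self):
        name = self._tag_name()
        if name in TAGS:
            body = json.dumps({"name": name, "value": TAGS[name]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def do_PUT(self):
        name = self._tag_name()
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)
        if name in TAGS:                       # closed tag set: no creation
            TAGS[name] = json.loads(payload)["value"]
            self.send_response(204)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, fmt, *args):         # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), TagHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d/tags/" % server.server_port

# Read a tag, update it with PUT, then read it back.
with urlopen(base + "furnace.temperature") as resp:
    before = json.load(resp)["value"]

req = Request(base + "furnace.temperature",
              data=json.dumps({"value": 1300.0}).encode(), method="PUT")
urlopen(req).close()

with urlopen(base + "furnace.temperature") as resp:
    after = json.load(resp)["value"]

server.shutdown()
print(before, after)   # -> 1250.0 1300.0
```

Any client on the plant network could read or update a tag with plain HTTP and JSON, with no device-specific SDK involved; in a real system you would of course add authentication and validation.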

What are the main benefits of adopting REST in your process control software?

  • A standard protocol to expose data (HTTP)
  • Client applications can access REST information in an easy way (XML and JSON);
  • REST is very well suited:
    • To provide access and update configuration data
    • To provide and update runtime information (like the Finishing-Mill parameters)

How can we write REST applications?

The WCF REST Starter Kit is here (developed by the MS WCF team)!

It has been published on CodePlex and it could be included in .NET 4.0.

A set of templates are available for Visual Studio 2008:

  • Rest Collections/Singleton services;
  • Atom Feed/Atom Publishing Protocol
  • HTTP/POX services

And what about the C# code that we should write? Easy: a plain operation on the service contract, marked with the WebGet attribute so that it is reachable over HTTP GET:

[WebGet]
ProcessControlData GetTag(int tagId);

The WCF REST Starter Kit also provides:

  • the WebProtocolException class to implement exception management in REST services;
  • the RequestInterceptor class to manage the processing pipeline of the service;


PDC 2008 - Day 4 - Sessions


TL35 WCF: Developing RESTful Services by Steve Maine


Learn the latest features in Windows Communication Foundation (WCF) for building Web 2.0-style services that use URIs, HTTP GET, and other data formats beyond XML. See how these features can be applied to AJAX web sites, "REST" applications, and data feeds.

What I carried out...

Great, great, great presentation. If you don't know the meaning of REST this is a must!

A full discussion of REST is too long for this short write-up.

I just want to note down:

  • The REST Starter Kit is a set of libraries and templates which makes the development of REST solutions easier. I think a big part of its value is the exception management, which is usually very tricky to implement in this kind of architecture.
  • The pillars of REST programming in .NET 3.5 are:
    • [WebGet] + [WebInvoke]
    • UriTemplate
    • WebHttpBinding

BB12 .NET Services: Messaging Services - Protocols, Protection, and How We Scale by Clemens Vasters


Look under the hood of the Microsoft .NET Services service bus, the protocols we use, and how to use the services from non-Microsoft platforms and languages. Learn which part of the messages and requests the Building Block service inspects, which parts are not inspected, and how you can verify this. Also, learn how to work through NAT and firewall limitations. Last, hear about the architecture on the data center side that enables "Internet scale."

What I carried out...


I didn't go to the previous presentation about messaging and the event bus, so it was very hard to understand some topics in this session.

  • There are a lot of different bindings to connect to the message bus. You have to evaluate and test the best one according to your architecture.
  • The event bus is a "queue" but it's not guaranteed to be fully reliable. You should consider it a buffer. This was a big surprise for me!

TL31 "Oslo": Building Textual DSLs by Chris Anderson, Giovanni Della-Libera


The "Oslo" modeling language can define schemas and transformations over arbitrary text formats. This session shows you how to build your own Domain Specific Language using the "Oslo" SDK and how to apply your DSL to create an interactive text editing experience.

What I carried out...

How to implement a DSL defining a new grammar.

How to generate assemblies implementing that grammar that can be loaded from .NET.

There is huge value in this for creating old-style compilers. I'll blog about it later.

BB27 .NET Services: Orchestrating Services and Business Processes Using Cloud-Based Workflow by Moustafa Ahmed


See how simple it is to use cloud-based workflow services to run business processes in the cloud as well as perform orchestration across on-premises and cloud services while running workflows in an environment that scales automatically.

What I carried out...

When you create a WF application you need a host. Now you can choose:

  • Your own implemented host;
  • Dublin (an application server);
  • .NET Workflow Services on the Azure platform

Why choose Workflow on Azure?

  • It's scalable, and if you need more performance you can buy it!
  • Reliable and available - it's hosted in MS datacenters
  • Accessible from anywhere - it's in the cloud; you can leverage it to connect your services over the Internet


  • It supports .Net 3.5
  • There are new activities to interact with the Service bus
  • You can use the existing designer

My personal feeling is that you need to evaluate the worst case in a project to be sure that you can implement what you need. I'm scared of hitting a wall in the environment at a later stage of the project.


PDC 2008 - Day 3 - Other sessions



The Keynote by Rick Rashid was very impressive. I already knew about Microsoft Research, but I didn't know it was so wide and that so many people work there. Google does a lot of marketing around Google Labs but, to be honest, I'm more impressed by MS Research (which works in the shadows), and I hope MS will not build a marketing strategy on it!

The video is here.

For people from Italy, take a look at this.

TL06 WCF 4.0: Building WCF Services with WF in Microsoft .NET 4.0 by Ed Pinto


Eliminate the tradeoff between ease of service authoring and performant, scalable services. Hear about significant enhancements in Windows Communication Foundation (WCF) 4.0 and Windows Workflow Foundation (WF) 4.0 to deal with the ever increasing complexity of communication. Learn how to use WCF to correlate messages to service instances using transport, context, and application payloads. See how the new WF messaging activities enable the modeling of rich protocols. Learn how WCF provides a default host for workflows exposing features such as distributed compensation and discovery. See how service definition in XAML completes the union of WF and WCF with a unified authoring experience that simplifies configuration and is fully integrated with IIS activation and deployment.

What I carried out...

One of the best presentations I've seen at PDC this year.

I will talk about Dublin as a hosting process in another post. What I really appreciated in this session was the use of Workflow Foundation (WF) for the management of asynchronous messaging in complex interacting-systems scenarios.

Why do I have this feeling about asynchronous messaging? Because it has HUGE RELEVANCE for my job (industrial and tracking applications), so I strongly recommend that my colleagues (Eros, Lucone, Gallo, Valerio and Luca) take a look at the video (download it and play it on an airplane trip!)

It's also very impressive how the WF editor is improving. It's getting very close to the Orchestration editor of BizTalk 2006 (which was born for the orchestration of business processes...).

PC22 Windows 7: Design Principles for Windows 7 by Samuel Moreau


Together, we can increase customer enthusiasm, satisfaction and loyalty by designing user experiences that are both desirable and harmonious. In this session we introduce the Windows User Experience Principles approach to shipping software. Along the way we share stories and lessons learned along the journey of designing the user model and experience for Windows 7, and leave you with a set of principles that you can apply as you build your applications for Windows.

What I carried out...

A skipped lunch....

TL24 Improving .NET Application Performance and Scalability by Steve Carroll, Ed Glas


Performance must be considered in each step of the development lifecycle. See how to integrate performance in design, development, testing, tuning, and production. Work with tools and technologies like: static analysis, managed memory profiling, data population, load testing, and performance reports. Learn best practices to avoid the performance pitfalls of poor CPU utilization, memory allocation bugs, and improper data sizing.

What I carried out...

This session will be of interest to my colleague AlessandroF because it was based on VSTF 2010 and the new tools for performance testing.

The basic idea is that there are different tools to meet different performance-analysis requirements during the steps of the project lifetime.

  • During "Requirements Gathering" you would use a tool to set the performance goals for different scenarios;
  • During "Design" you would run end-to-end tests to evaluate your architecture;
  • During "Development" you would run tests and evaluate how your changes affect the previous release (the following picture shows an out-of-the-box report):


There is a strong interaction between the performance-analysis tools and the ALM side of VSTF 2010, so you can evaluate your progress over time.

There are also a lot of improvements in the tools themselves:

  • Now you can profile JavaScript!
  • There is a memory profiler tool and a Contention Profiler (this is very important for multi-core development: you can look at a lock and jump to the code that is causing it!)
  • Tools work remotely and under virtualization.

BB18 "Dublin": Hosting and Managing Workflows and Services in Windows Application Server by Dan Eshner


Hear about extensions being made to Windows Server to provide a feature-rich middle-tier execution and deployment environment for Windows Workflow Foundation (WF) and Windows Communication Foundation (WCF) applications. Learn about the architecture of this new extension, how it works, how to take advantage of it, and the features it provides that simplify deployment, management, and troubleshooting of workflows and services.

What I carried out...

Dublin is one of my favourite technologies from PDC. It's an application server to host workflow instances.

It's also important for my job (in the industrial world, but also for business process management and SharePoint), so I'll blog separately about it.

PC56 Windows Embedded "Quebec": Developing for Devices by Shabnam Erfani


Do you need to understand how to extend your applications and services to embedded devices using Windows 7 technologies? See the new Windows Embedded roadmap and hear plans for our next-generation offering built on Windows 7 technologies.

What I carried out...

This is for my colleagues Eros and Martino, who work with Windows XP Embedded. Please read this!

  • Quebec is the new release of XP Embedded, based on Windows 7 (yes, Vista has been skipped!)
  • No, it's not for real-time; if you need real-time you need a 3rd-party extension
  • Language independent (XP was based on English; here you can bind different language images)
  • Sensor SDK for the development and integration of external sensors (but I cannot find any more references to this)
  • 64-bit support
  • Minimum image size of 512MB (to fit on a Flash)
  • To create an image the following tools (Quebec image build tools) are available:
    • Image Builder Wizard (IBW)
      Lets you install Quebec interactively or unattended
    • Image Configuration Editor (ICE)
      GUI tool to create image configurations and distribution shares for image configuration
    • Deployment Image Servicing and Management (DISM)
      Installs feature sets to an offline or online Quebec image
    • Windows PE 2.1
      Windows operating system with limited services, used for initial image installation
    • Sysprep
      Removes system-specific data from an embedded Windows image
      Supports application plug-ins
    • Windows Deployment Services (WDS)
      Used for remote installation of images on devices
    • Additional tools for managing language packs, drivers, and servicing

PDC 2008 - Day 2 - Other sessions


OK, I took hundreds of photos of the slides only to discover that everything gets published one day after the presentation here: https://sessions.microsoftpdc.com/public/timeline.aspx ! Happy about that :-D

It's hard to say whether that's a good thing, because a lot of people have paid to join the conference and to sell back the new skills.

From my perspective the PDC is a huge opportunity for a full immersion in a lot of new technologies, getting a direct feel for them. So I appreciate the immediate availability of the slides and videos.

BB36 FAST: Building Search-Driven Portals with Microsoft Office SharePoint Server 2007 and Microsoft Silverlight by Stein Danielsen, Jan Helge Sageflåt


The combination of FAST ESP and Microsoft Office SharePoint Server (MOSS) 2007 allows for the development of powerful search-driven portals. Learn about the architecture and functionality of FAST ESP, and see how FAST ESP can complement and extend existing search features in MOSS 2007. Watch a demonstration that shows how to create search user interfaces by configuring and extending the FAST ESP Search Web Parts, including the use of Silverlight to deliver unique search experiences.

What I carried out...

At the moment Endeca is better than FAST (from my perspective) but there is a strong commitment from MS to improving the platform.

The first result is the availability of a set of web parts on CodePlex that integrate the FAST backend: http://www.codeplex.com/espwebparts

In the medium term, FAST should strategically be the best solution due to its strong integration with the platform.


BB26 SQL Server 2008: Business Intelligence and Data Visualization by Stella Chan


Learn how to create an entity data model and bind it to data visualization and ReportViewer controls. Dive into new Reporting Services features like: Tablix, new Data Visualization controls, and the new Report Creation experience. Also, preview the future AJAX ReportViewer control and the new RDLC designer.

What I carried out...

My expectation was for a session going deep into BI topics and discussing mining, but a lot of time was spent on graphics controls.

The interesting stuff was:

  • Take a look at the Microsoft Charting Controls for .NET Framework 3.5. They are very powerful. In the past I always used OWC or 3rd parties.
  • Report Builder 2.0 is the new report designer shipped with SQL Server 2008, tailored to power users. Reports designed with it can now be hosted by the Visual Studio 2008 Reporting Controls, so you can embed a report inside an application without having a full Report Server.

TL27 "Oslo": The Language by Don Box, David Langworthy


The "Oslo" language, at the heart of the Oslo modeling platform, allows developers to quickly and efficiently express domain models that power declarative systems, such as Windows Workflow Foundation and "Dublin." In this session, we'll get you started writing models for your own domains by introducing you to key features of the language, including its type system, instance construction, and query. You'll learn to author content for the Oslo repository and understand how to programmatically construct and process the content to target your own specific runtime environment.

What I carried out...

Oslo was one of the top topics of the PDC and I'll blog about it to reorder my understanding.

In this (short) session David Langworthy was a little bit restless (maybe because Don Box was there), so the presentation wasn't great.

We saw M (the textual DSL language) and how to persist the modelled DSL to the DB.

The idea is that you can model a world with a text-based language (for example defining entities such as PowerSwitch, PowerLine, PowerConsumer, ...); using the language you also define the "plumbing", instances and attributes for each entity, persisting everything to a SQL Server database.

Using M you can also query the DBs!

More or less you can think of M as a query and definition language for a DSL, in the same way that LINQ is a query language for SQL...

ES02 Windows Azure: Architecting & Managing Cloud Services by Yousef Khalidi


From design to deployment, building a scalable, highly available service is different from building other kinds of applications. This session discusses the impact that designing for the cloud has on all stages of the service lifecycle, and how the Microsoft cloud platform works for you to meet the scaling and availability goals of your service. This session will show how automation is used to free the developer from dealing with many hardware and networking issues. Also learn how the cloud services platform is architected to enable a pay-for-use dynamic model.

What I carried out...


You have to think about an Azure solution in a totally different way. It's too early to say anything more; I need to try writing some code to understand how it works, where the walls are, and the real-world applications you can write.


TIQ-Industrial - New white papers released


On the TIQ-Industrial site, you can find some new white papers:

  • Industrial sites vehicles tracking with GPS-DGPS-GPRS technologies
  • Data-Warehouse And Mining Tools For Steel Production Control
  • Sunsetting: A solution framework to revamp and integrate the Level-2 process control software
  • An Integrated Production Site

If you're interested in those topics, I hope you will enjoy them!

PS: I know, I know, the aesthetics and look & feel of the site are bad! Give us more time, we're working on it...


PDC 2008 - Day 2 - Keynote

Windows 7 is here...


Steven Sinofsky (who reports directly to Steve Ballmer) talked about Windows 7. Here are the key messages from the presentation:

Windows 7 - A lap around the User Interface

  • Improvements to the Task Bar and Tray Area (notification management and Jump Lists);
  • The windows are now dockable on the desktop and the UI feels like a multiple-document interface, like the Office clients;
  • Libraries are groups of storage resources (external disks, folders, etc.) that can be aggregated from the search point of view, and you can search items in a specific Library (i.e. My Videos...);
  • Home Groups are sets of shared resources that you usually find and connect to at home (printers, players, shared folders on a NAS, etc.), to which Windows 7 will connect automatically even if the machine is usually joined to a domain (like a business laptop). Before Home Groups, to get the same result you had to write your own startup scripts;
  • Device Stage "defines" the plumbing of a device and which other devices it can be connected to. It's also a single management entry point for each device;
  • You can play an MP3 file or MPG video remotely (right click and "Play to", with a list of target devices: players, MM devices, ...);
  • Touch is definitely the major improvement in the UI. I don't have an iPhone, but it should feel the same, tailored to big screens;
  • BitLocker has been extended to USB and external storage devices. BitLocker is a technology to encrypt and protect data on storage devices. Now it also works with solid-state memories.
  • VHD: it comes from Hyper-V. Windows 7 can manage different VHD files and can itself be installed on a VHD, because the boot manager can choose the image to start! I love this!
On the left: the major improvements in Windows 7.
The new version is not a breakthrough like Vista compared to XP, but appears to be a major revision of Vista, improving it according to user feedback.
Everybody at PDC has a lot of expectations about performance!
On the left: the Jump List over the Task Bar.
Device Stage: a single point of management for each device connected to a hosting PC.
A demo of Touch using a new home PC from HP. Think of an iPhone experience on a bigger LCD...
Very interesting for multimedia and obviously for process control applications (SCADA) on plant production lines.
Keep in mind: Touch was also supported by Vista, but now it has been strongly enhanced in terms of user experience (for example with multi-touch).
Windows 7 (the front end) has been re-engineered to interact closely with Live Services (the user's cloud).
About the transition: the driver model and the drivers are the same as in Vista, so there will not be the device-compatibility issues we had moving from XP to Vista.

A call to action for developers, with new features to be evaluated through an API:

  • Ribbon User Interface
  • Jump-List
  • Libraries
  • Multi-Touch, Speech
  • Direct-X
The Windows 7 development project drivers:
A smaller memory footprint and less I/O to get more performance.
I want to see the improvement in registry access, which is one of the worst bottlenecks in Vista.
The pre-beta is available now! I will evaluate whether to install it on my laptop.
Beta 1, Beta 2 and RTM will follow.
The pre-beta already meets the performance goals but is not yet feature-complete.
Some other minor improvements:

The DLP projector integration and the extended desktop through Remote Desktop are great!
For the first time in a Microsoft presentation :-D Steven Sinofsky showed a laptop switching on the DLP projector, and it worked (with all the people clapping).
Call to action for developers!
Scott Guthrie explained why Windows 7 is a great platform for developers...
But this message was built on user-interface improvements.
I must admit that DLP management and the Remote Desktop extension are great for my job.
A demo from Autodesk in which they leverage Microsoft Surface to improve the user experience of designers.
The demo was "cute", but I don't think that in a real-world application you can model your sketch with a finger... Maybe some applications are feasible for fashion.
Obviously both the .Net and C++ development environments are supported on Windows 7.
For C++, VS2010 is mandatory.
For .Net, the 3.5 SP1 framework release is ready.
.Net 4 is closer to being really available. There is now a CTP on a VHD for a first evaluation.
A lot of improvements in WF and WCF, and now Dublin is available (but I'll blog about it later).
The idea is that you can host and MANAGE workflow processes. Dublin is "de facto" a WF application server.
Extensibility in VS2010: you can create your own extension assemblies (for example to add features to the IDE and the editor, as in the picture on the left) and deploy them by just copying the DLL into the VS extension directory.
No registration needs to be performed!
A very impressive demo from Tesco (the UK grocery market leader). They created a Windows 7 gadget, totally based on WPF, that works with the Touch experience to create a community of families, manage an agenda of deliveries, and leverage the webcam integrated in a PC to read barcodes and place orders!
It was impressive.
jQuery will be part of the Microsoft development tools and it will be supported by MS as an open-source project!
jQuery is a lightweight JavaScript library and one of the most popular libraries among web developers. In a very easy way you can interrogate (query) your HTML page, which is very useful for creating Ajax applications.
In a Visual Studio (2010) solution you can now have different configuration files for different target environments (development, staging and production, for example).
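The per-environment configuration idea can be sketched outside Visual Studio too. A minimal Python analog (the setting names and connection strings here are invented for illustration; VS 2010 uses separate config files per build target instead of a dictionary):

```python
import os

# Hypothetical per-environment settings, one block per target environment.
CONFIGS = {
    "development": {"connection_string": "Server=localhost;Database=dev"},
    "staging":     {"connection_string": "Server=stage-db;Database=app"},
    "production":  {"connection_string": "Server=prod-db;Database=app"},
}

def load_config(environment=None):
    """Return the settings for the requested (or APP_ENV-selected) environment."""
    env = environment or os.environ.get("APP_ENV", "development")
    return CONFIGS[env]

print(load_config("staging")["connection_string"])
```

The point is the same in both worlds: the code never changes, only the configuration selected for the target environment does.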
Improved support for Silverlight in Visual Studio 2010.
A Silverlight designer is now included directly in VS (and VS itself is based on WPF), and a toolkit with some additional controls for Silverlight is available here.
Live Services: using the new development platform, it is very easy to create Mesh applications tailored to consumer users.
There is the Live Services framework (a .Net library) to create applications in this environment.
More information about this world here.
A quite impressive demo from the BBC.
An offline multimedia player based on Silverlight!
It can interact with all devices having Live Mesh installed to share the user's preferences.
Office 14 is coming...
We've seen some very impressive demos about the integration of Office Online with desktop Office to collaborate on documents (OneNote, Word, Excel).
In this demo, across two concurrent PCs, it is possible to collaborate in OneNote using the PC client and the online version at the same time!

The same approach is also possible using Word and Word Online.

I'd like to know more about Office 14, but I think this will be the only Office 14 topic in the whole PDC.
Is it Excel 14? No, it's Excel Online hosted in Firefox.
And this is Microsoft Surface. No mouse, no keyboard, just your hands.
We can think about application scenarios (...first of all, high-end branded retail shops!)


Keynote from Chris Anderson and Don Box. It's not a speech, it's a jam session; they're great (and they didn't use PowerPoint!).

The expectation among the PDC crowd for them is the same as for a Bruce Springsteen concert...

They compressed a lot of topics into just 70 minutes. Very impressive:

  • They started from the PDC 2005 demo built on LINQ to get the list of processes from a system, with a service (the provider, exposing data through WCF) and a console client (the consumer);
  • The service was then transformed into a simple Azure (cloud-based) service and published on the Service Bus; finally, they deployed the service into the cloud so that it was hosted by the Azure servers;
  • Later they interacted with Live Mesh using the same approach, just changing the authentication provider.
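The query at the heart of that demo was a LINQ filter-and-project over the machine's process list. As a rough Python analog (not the demo's actual code; the sample data is hardcoded here, where the demo used LINQ over the real process list):

```python
# Sample data standing in for the machine's real process list.
processes = [
    {"name": "notepad.exe", "memory_mb": 12},
    {"name": "devenv.exe", "memory_mb": 450},
    {"name": "outlook.exe", "memory_mb": 180},
]

# LINQ-style query: order by memory use, filter, then project the names.
big = [
    p["name"]
    for p in sorted(processes, key=lambda p: p["memory_mb"], reverse=True)
    if p["memory_mb"] > 100
]

print(big)  # the provider would return this list over WCF
```

The provider/consumer split then just moves this query behind a service boundary.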

The most important message from this talk was that the service they wrote was accessible through standard protocols.

Azure-based services are accessible through a "REST-ful" interface, using the existing GET/PUT/POST/DELETE verbs in HTTP to execute the corresponding Read/Update/Create/Delete data operations on the service provider.

At each step they just changed the client's URI to change the plumbing and connect to a different service (WCF, Azure, Live Mesh), leveraging the same pattern.
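The verb-to-operation mapping described above can be sketched as a tiny dispatcher. This is only an illustration of the REST pattern, not Azure's actual API; the in-memory store and the URIs are invented:

```python
# Hypothetical in-memory resource store standing in for a cloud service.
store = {}

def handle(verb, uri, body=None):
    """Map HTTP verbs onto CRUD operations, REST-style."""
    if verb == "POST":        # Create
        store[uri] = body
        return 201, body
    if verb == "GET":         # Read
        return (200, store[uri]) if uri in store else (404, None)
    if verb == "PUT":         # Update (or create at a known URI)
        store[uri] = body
        return 200, body
    if verb == "DELETE":      # Delete
        store.pop(uri, None)
        return 204, None
    return 405, None

# Because every backend speaks the same verbs, a client can switch
# between services (WCF, Azure, Live Mesh) by changing only the URI.
handle("POST", "/processes/1", {"name": "notepad.exe"})
status, body = handle("GET", "/processes/1")
```

That uniformity is what made the demo's "just change the URI" trick possible.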


PDC 2008 - Day 1 - Other sessions

ASP.NET MVC: A New Framework for Building Web Applications by Phil Haack

I already knew MVC, which is a tool-set built on top of ASP.Net to implement the MVC (Model View Controller) architectural pattern.

During this session Phil Haack explained some approaches to using ASP.Net and IIS as a more effective platform for building Web 2.0 applications.

ASP.NET MVC really looks like Ruby on Rails and it's very easy to use with Ajax.
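The Model View Controller separation itself is independent of ASP.NET. A minimal sketch of the pattern (the class, view, and route names are invented here, not taken from the framework):

```python
# Model: holds the data and the business rules.
class Question:
    def __init__(self, title, votes=0):
        self.title = title
        self.votes = votes

# View: turns model data into a representation (HTML here).
def question_view(question):
    return f"<h1>{question.title}</h1><p>{question.votes} votes</p>"

# Controller: handles a request, updates the model, picks a view.
class QuestionController:
    def __init__(self):
        self.questions = {1: Question("What is MVC?")}

    def upvote(self, question_id):
        q = self.questions[question_id]
        q.votes += 1
        return question_view(q)

controller = QuestionController()
# A route like "/questions/1/upvote" would dispatch to this action.
html = controller.upvote(1)
```

ASP.NET MVC (like Rails) adds routing, model binding, and view engines on top of this same split.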

The architecture of www.stackoverflow.com (maybe the first real application built with MVC) was explained by its CEO, Jeff Atwood.

MVC has been developed as an agile project in collaboration with the community, and it's supported through CodePlex. The official site is here, where you can download the toolkit and access all the documentation.

Identity: "Geneva" Server and Framework Overview by Caleb Baker, Stuart Kwan

Geneva (aka Zermatt) is a platform (server + development toolkit) for identity management and federation, composed of a set of tools and services:

  • Geneva Framework: the set of .Net libraries to connect your applications to this environment;
  • Geneva Server: the identity server that stores the policies and manages the plumbing of federation;
  • Geneva CardSpace: the CardSpace release tailored to Geneva.

Geneva is a complex environment for defining trusted sets of resources based on policies and claims against those policies (yes, it's the same pattern as Kerberos, which seems to have inspired Geneva).
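The claims-and-trust pattern behind Geneva can be sketched roughly as follows. The claim names, issuer URL, and policy are all invented for illustration; the real framework works with signed security tokens, not plain dictionaries:

```python
# A token from a trusted issuer (an STS) carries claims about the user.
token = {
    "issuer": "https://sts.contoso.example",  # hypothetical STS address
    "claims": {"name": "alice", "role": "purchaser", "age_over_21": True},
}

# The application trusts specific issuers, not individual users.
TRUSTED_ISSUERS = {"https://sts.contoso.example"}

def authorize(token, required_claims):
    """Check the issuer first, then the claims; the app never sees raw credentials."""
    if token["issuer"] not in TRUSTED_ISSUERS:
        return False
    return all(token["claims"].get(k) == v for k, v in required_claims.items())

allowed = authorize(token, {"role": "purchaser"})
```

Federation is then just extending `TRUSTED_ISSUERS` to include a partner company's identity server, which is what makes cross-company applications possible.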

The topic is hard and complex to blog about, but very interesting, and it's a must for anyone who needs to create federated cross-company applications over HTTP (...or obviously if you'll write code for the cloud!)

The beta of Geneva is available now; the RTM is expected in the second half of 2009.

The framework is available for VS2008, VS2010.

SQL Server: Database to Data Platform - Road from Server to Devices to the Cloud by David Campbell

David Campbell explained how the world of databases has changed in the last 20 years.

Now you can choose among different DB technologies, all based on SQL Server:

  • SQL-Server mobile
  • SQL-Server
  • SQL-Services (aka SDS)

What's the best for what?

It depends on the scenario. You have different choices that let you leverage different hardware capabilities within one overall solution.

It reminds me of Jack Welch, who used to say "Think globally, act locally", and that's true in a well-conceived architecture.

My expectation from the presentation was to learn something more about SQL Server Services, but we got just a bird's-eye view.

One drawback: when you create solutions for SQL Services, you cannot test them locally.

PDC 2008 - Day 1 - Microsoft Visual Studio Team System: A Lap Around VSTS 2010

A nice and interesting presentation from Cameron Skinner about VS2010, focused on the new testing and ALM features.

Here are the requirements for the new release of VS2010!
Those requirements translate into new functionalities and practices supported by VS2010, with a lot of small improvements that promise to greatly simplify developers' daily life!


This is the bug-report view, which has been strongly improved. The tester can now also save a video to show the error to the developer, and when the application crashes it is possible to save the state and context (a dump) so that developers can inspect this file with the integrated VS debugger to evaluate what happened.

Some improvement included in the IDE:

  • Test Impact: when you choose this function (with a right-click) on a function, you can see the tree of impacted tests (to avoid the butterfly effect that happens when you change a small piece of code and a lot of unexpected functionalities are impacted!);
  • Test Validation: before check-in, a test validation is mandatory to avoid breaking a continuous-build process;
  • Sequence Diagram: it's possible to generate a sequence diagram to show the interactions between functions and methods.
On the left there is an architectural diagram of the namespaces in a solution. It's a cool feature because you can define architectural constraints, and when the code violates them, warnings and reports are generated automatically so that you can review the architectural assumptions.

The Test Recorder is maybe the most complex feature added to the test tool-set in VSTS. You can record your interactions with web applications and Windows Forms applications to perform regression tests and to include these "functional tests" in the validation process.

VSTS 2010 includes some functionalities that used to be part of other sophisticated, vertical quality-control platforms (like Compuware, Mercury, etc.).


  • These are not all the features of VS 2010, but just a small subset about ALM for testing!
  • VS 2010 is a WPF application. That's cool! You can see a lot of improvements in the editors.
  • Drawback: unfortunately the Test Recorder doesn't support Silverlight, and I guess that future support hasn't been planned yet.


PDC 2008 - Day 1 - Keynote

The "huge" news from Ray Ozzie's keynote has been Azure, which represents Microsoft's entry into the world of cloud computing (a sexy buzzword to describe a buzz-concept like Software+Services :-D )


My expectations from PDC were for a set of tools and maybe server platforms to create the "clouds", but Microsoft has gone beyond that!


The main idea is to leverage the huge MS experience with data centers (Live, MSN, MSDN, Windows Update, etc.) to create a hosted, horizontal platform that external developers can use to create their own service applications without having to think about the engineering of the server farm (network, storage, disaster recovery...). Everything is managed and maintained by the MS team.

So everything we have known until today (the MS server platform, or what is assumed to be "on premises") is reinforced with a new set of online "tools" hosted and managed in the MS datacenters spread around the world.

To have access to those tools a license obviously has to be purchased (but the sales strategy doesn't appear to be defined yet). In 15 days, demo access tokens will be released to PDC attendees to test and play with the platform!

The online platform is composed of a set of services (SQL Services, Live Services, SharePoint Services, .Net Services, Dynamics CRM Services) that appear to be a "subset" of the original server applications, tailored to work in a cloud environment.

To create your hosted application, new project templates are available in VS 2008. What is really amazing is that you can run and debug your application locally and then deploy it to the online services!

To deploy the application just push the "Publish" button in Visual Studio.

Well, to be honest, it is not possible to debug SQL Server Services locally (but this wasn't mentioned during the keynote...).

Your own dashboard is available online to manage the application (I think there are two environments, for staging and production), but I didn't understand whether and how versioning is managed.

An application that leverages the hosted platform has been demonstrated (www.bluehoo.com).

OK, there is Silverlight, there are animations. It's a social-networking application integrating mobile phones with the cloud, but I didn't understand the business model (and this is my concern: when we talk about Azure there is a technical perspective where everything is cool, and a business perspective where everything is dark...).


Bob Muglia talked about the impact on Enterprise companies.

The idea is that we're entering a new era for computer applications (and that's potentially true, but I don't think adoption will be so quick).

What is very important for enterprise applications is the extensibility of the platform, which is guaranteed by .Net Services (you can create and deploy your own .Net assemblies and integrate them with a set of integration services like workflow, identity management, etc.).

Security, Identity Management and Federation are key concepts for cloud computing, and everything has been released in this platform.

What I've really appreciated is the Active Directory connector, which can be used to delegate authentication and authorization to the company's own system, which is "federated" into the service.

There are some wizards that guide you through the configuration of the "federation". It has been very impressive!


SQL Server Services is the database offered by Azure. Don't think of it as a hosted full SQL Server! It's a subset, and you don't get OLAP cubes, Integration Services, etc.

At the moment I don't know the available features, but the idea should be to have a relational online database for your online application. The implication is that you cannot use this as an alternative to SQL Server: it hasn't been conceived to store terabytes of data!

Also, if your application needs to integrate data from heterogeneous data sources (not over HTTP), maybe it is not a good idea to put it on an online platform!

Azure has been created for HTTP applications! EAI and ETL cannot run on this platform!

So keep it in mind! If you're from Italy and you're thinking of this platform as a way to get rid of the servers and the system engineers in your small-to-medium-sized company (a PMI), you're on the wrong track.


The final demonstration was given by Shawn Davison from RedPrairie, who explained how they delivered and integrated Microsoft System Center with a Software+Services approach, showing some dashboards and reports built with Reporting Services 2008.


Technical Perspective

  • Everything seems to be amazing.
  • What are the walls and constraints imposed by the platform's interface contracts? How complex can the applications be? What are the functional boundaries?
  • What are the performance trade-offs and the suggested storage and transaction limits (the workload)?

Business Perspective

  • A billing module is missing. That's a gap for a services development platform. I'd like to see off-the-shelf functionality to manage the billing of the services!
  • How will the platform be sold by MS?
  • What is the partner ecosystem? Potentially some applications could be moved from small/medium-sized companies to the hosted platform (Exchange, SharePoint, CRM), shrinking the market for system-engineering consultancy.
  • What could the business model be for service applications hosted on Azure? Who will build them? Maybe Azure could be for innovative startup companies what MySpace has been for musicians...