OFS06-IS – Building custom e-commerce solutions faster and easier with Microsoft Commerce Server 2009
As I have always wondered what Microsoft Commerce Server brings to the table when building an e-commerce solution, I found myself at my first interactive session. For those not familiar with TechEd's session types: an interactive session is held in a small cubicle-type room seating around 30 people, where the audience can really interact with the speaker(s). During the session Scott Cairney gave us an overview of the 2009 R2 edition and explained what the team has done so far to enable a developer to make better use of the API. What was kind of impressive to hear was that they have tested MCS R2 with a store of 1 million profiles and 1 million products, achieving an average of 22 orders per second. It might look trivial to some, but just imagine your company having to package and label 22 big plasma screens a second; that's 1,320 plasmas a minute, 79,200 an hour, and so on. You get the picture :-). These tests are done in so-called Technology Centres, so my first question was: can customers use these centres to test their products leveraging Microsoft technology? Although Scott didn't tell me how, the answer was YES! Simulating applications with these kinds of numbers is pretty hard and costly to do in your own environment, because you would have to set up a lot of hardware and most companies only have a limited test/acceptance environment.

Back to MCS R2: Scott showed us a couple of demos that leverage Microsoft SharePoint with the numerous web parts that MCS R2 offers. The first question that popped into my mind was: what if I do not want to use Microsoft SharePoint? No problem, you can still use a lot of the web parts in your own ASP.NET application. Of course, if you step out of the .NET scene and build a PHP front end, you will be building your own interface against the MCS R2 web services layer. In one of the demos Scott showed how you could easily build an e-commerce site with lots of micro-sites, each having a different layout. The differences in layout were created by leveraging XSLT templates. I'm not a hero in XSLT, so this might be old news to you, but I learned that you can use XSLT actions in your templates to call back-end code services!

I also asked about the ability of MCS R2 to provide faceted navigation, but no joy here: it will still be up to you to integrate your solution with another search engine product like Solr, FAST or Autonomy to enable faceted navigation. In my opinion a big missing feature in MCS R2, because 9 out of 10 e-commerce sites offer faceted navigation.
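To make that XSLT point a bit more concrete, here is a minimal sketch of the general technique of calling back into application code from an XSLT template via a registered extension function. It uses Python and lxml rather than the Commerce Server stack, and the shop namespace, price_lookup function and catalog XML are made up for illustration only; it is not the MCS R2 API.

```python
# Sketch: an XSLT template calls a registered extension function, which stands
# in for a back-end service call (hypothetical data, not Commerce Server).
from lxml import etree

def price_lookup(context, sku):
    # Stand-in for a back-end lookup (e.g. a catalog/pricing service).
    prices = {"TV-42": "499.00", "TV-50": "799.00"}
    return prices.get(str(sku), "n/a")

XSLT_DOC = b"""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:shop="urn:example:shop">
  <xsl:template match="/catalog">
    <ul>
      <xsl:for-each select="product">
        <li><xsl:value-of select="@sku"/>: <xsl:value-of select="shop:price(string(@sku))"/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>"""

XML_DOC = b'<catalog><product sku="TV-42"/><product sku="TV-50"/></catalog>'

# The extension function is registered under the shop namespace, so the
# template can call shop:price(...) and get data from application code.
transform = etree.XSLT(
    etree.XML(XSLT_DOC),
    extensions={("urn:example:shop", "price"): price_lookup},
)
print(transform(etree.XML(XML_DOC)))
```

Swap the XSLT template per micro-site and you get different layouts over the same data, which is essentially what the demo showed.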
ARC304 – Command query responsibility segregation
Udi Dahan, need I say more? Well, maybe: after last year's session on how to leverage the internet to cache your application, I've been hooked. Udi's presentation style is just perfect, and it's a perfect fit for the high-level concepts that get conveyed in his sessions; at first it might seem slow, but you'll need the time to fully grasp the concepts in his talks. In an earlier talk this year by Greg Young, I already got an impression of what you can achieve with this pattern, but Udi gave some more examples of how and where to leverage it.

In most systems data has a varying amount of staleness. Lots of data gets cached for performance reasons, but without realizing it, by caching you have explicitly defined your data as being stale. Yet none of the end users is aware of this fact, so why not make it explicitly known to the user? Imagine a data grid with all kinds of editable user data. When a user starts editing values on multiple rows, he or she doesn't know whether those edits can be saved; another user might already have edited the same values, so the save will be inconsistent: one record might succeed and another might not. Furthermore, from a domain perspective, editing multiple records at once is not very intent revealing. For example, if the data grid contained user data (name, address) for an e-commerce application and you changed the address, you might want the system to update all shipping addresses on old orders to the new address (or not); how would the system know what to do?

So when putting up your user interface, make sure it conveys the system's functionality and lets the user express their intent explicitly. When doing validation you must always validate the business rules within the system, but it is also important to validate the user's commands, not the user's queries. Once you have clearly split the commands from the queries in the system, why would you still need all kinds of layers of indirection to get data from the database? Just use DTOs to read the data from the database; entities have no added value there anymore. And once you have separated the queries from the commands, why not split the data store in two as well: one store for commands and one (or more) for querying? You can imagine the scalability improving significantly.
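To make the command/query split a little more tangible, here is a minimal sketch in Python under my own assumptions. The names (ChangeCustomerShippingAddress, CustomerSummaryDto, InMemoryStore) are hypothetical and this is not code from Udi's talk; the point is only that commands carry explicit intent and get validated against business rules, while queries return plain DTOs straight from a (possibly separate, possibly stale) read store.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ChangeCustomerShippingAddress:
    """Command: intent-revealing, the user states explicitly what should happen."""
    customer_id: int
    new_address: str
    apply_to_open_orders: bool


@dataclass(frozen=True)
class CustomerSummaryDto:
    """Query result: just data for the screen, no entity or behaviour."""
    customer_id: int
    name: str
    shipping_address: str


class InMemoryStore:
    """Stand-in data store; in a full CQRS setup reads and writes could use two stores."""
    def __init__(self):
        self.customers = {1: {"name": "Jan", "shipping_address": "Old Street 1"}}

    def update_address(self, customer_id, address, apply_to_open_orders):
        self.customers[customer_id]["shipping_address"] = address
        # ...here the historic orders would be updated if the command asked for it...

    def fetch_customer(self, customer_id):
        return {"id": customer_id, **self.customers[customer_id]}


class ChangeAddressHandler:
    def __init__(self, write_store: InMemoryStore):
        self.write_store = write_store

    def handle(self, cmd: ChangeCustomerShippingAddress) -> None:
        # Business rules are validated against the command, not against reads.
        if not cmd.new_address.strip():
            raise ValueError("address must not be empty")
        self.write_store.update_address(cmd.customer_id, cmd.new_address,
                                        cmd.apply_to_open_orders)


def customer_summary(read_store: InMemoryStore, customer_id: int) -> CustomerSummaryDto:
    # Reads skip the domain model and go straight to a read store (possibly stale).
    row = read_store.fetch_customer(customer_id)
    return CustomerSummaryDto(row["id"], row["name"], row["shipping_address"])


if __name__ == "__main__":
    store = InMemoryStore()
    ChangeAddressHandler(store).handle(
        ChangeCustomerShippingAddress(1, "New Street 2", apply_to_open_orders=False))
    print(customer_summary(store, 1))
```

Because the read side only ever returns DTOs, it can later be pointed at a separate, denormalized query store without the command side noticing.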
ARC310 – Application architecture guide: The map to your journey
Because of the lack of session information provided in my TechEd bag, I ended up at this presentation with high expectations because of the speaker: Don Smith. Unfortunately, Don did a presentation on the 2nd edition of the Application Architecture Guide, and to be quite frank, I'm not wasting precious TechEd time listening to a book presentation. So I got the message (get the book when it becomes available; they did a rush job to get copies printed for the PDC) and went out within 10 minutes. Later I heard that after 30 minutes a lot more people left.
EMB204 – Smart metering as an enabler for home automation and customer interaction
As an IT guy one is always interested in cool gadgets and fancy gizmos, and home automation is one of the biggest nerdy gadget extravaganzas of all time. But home automation isn't just nerdy; it is coming to your home more quickly than one realizes, just like the internet, which was a platform for the select few so many years ago and is used by almost everybody today.

The presentation showed how in Germany people's houses are being fitted with all kinds of metering equipment and provided with an IP-based gateway that sends the house metrics to systems in the cloud. This information can then be accessed via a web portal so a user can view his or her energy consumption. As you can imagine, the more sensors in the house providing data, the more information is available to the user. The sensors vary from a simple "how much current is running through me" to measuring the amount of heat produced and lost in your house. The simplicity of installation, the low cost of these sensors and controllers (and also funding from the government), and the ease of access to a web portal set a very low threshold for consumers to pick up home automation.

Apart from the KEWL and WOW factor involved, it introduces a whole lot of new options, from turning on your coffee machine from your mobile phone to reducing your carbon footprint. Some of the funny trivia given was that saving on electrical energy is almost pointless from an economic point of view in northern Europe, but saving on heating is the way to go. Although they expect only around 100,000 users by the end of next year, the infrastructure set up to receive and process all the information is highly scalable due to the use of cloud technology.
Microsoft itself offers guidance and solutions to common integration issues at http://www.microsoft.com/utilities.
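Just to illustrate the kind of plumbing described above, here is a minimal sketch of a home gateway pushing sensor readings to a cloud ingestion endpoint over HTTP/JSON. The endpoint URL, payload shape and sensor names are pure assumptions for illustration; the actual German platform from the talk was not shown at this level of detail.

```python
# Sketch: a home gateway batches sensor readings and posts them to a
# hypothetical cloud ingestion endpoint as JSON over HTTP.
import json
import time
import urllib.request

INGEST_URL = "https://metering.example.com/api/readings"  # hypothetical endpoint


def post_readings(gateway_id: str, readings: list[dict]) -> int:
    payload = json.dumps({
        "gateway": gateway_id,
        "sent_at": int(time.time()),
        "readings": readings,
    }).encode("utf-8")
    req = urllib.request.Request(INGEST_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Example batch: one electricity reading and one heating reading.
    post_readings("home-gw-0001", [
        {"sensor": "power-main", "watts": 2300},
        {"sensor": "heating-flow", "celsius": 54.5},
    ])
```

The cloud side only has to accept and store these small messages, which is why the back end scales out so easily once it runs on cloud infrastructure.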
WIA403 – Tips and tricks for building high performance web applications and sites
INT201 – The interoperability imperative
Due to poor resource planning on behalf of the TechEd people, the sessions we wanted to attend were full, so we ended up at this session; nevertheless it was quite an interesting one. Sandra Schafer took us on a tour of the efforts Microsoft has taken to become a more open and compliant company when it comes to adhering to international standards. Noticeable are the more than 2,000 communication protocols that have been documented and are freely available, OpenXML, and the founding and sponsoring of the CodePlex Foundation. Working at ICTU has left me with a YIKES feeling every time I see an email with an ODF attachment, so it was good to hear that Office 2007 SP2 supports ODF for reading and writing. Sandra also pointed out that there is a great whitepaper by Fraunhofer FOKUS comparing OpenXML and ODF.

Sandra also addressed the importance of interoperability on different levels, showing that communication between systems is just as important as data portability between systems. A great example I always like to use, and have trouble with at home, is photo library software. This software lets you categorize and tag your photos so that, among the multitude of photos in your library, you can always find the one you want quickly. But most of these software packages use a proprietary data store from which you cannot export to another package. Sound familiar?