Niels Berglund’s “ADO.NET v.Next and the Entity Framework” was the first lecture of the day. I decided to skip Dino Esposito’s lecture after I read his slides, and went for something that I had not even looked at. I am glad that I did. Niels’s lecture was brilliant, and he showed us some of the things that can be done with ADO.NET v.Next, especially the abstraction of the data model into a more programmer-friendly model. What this means is that you query something that is meaningful to you, rather than querying a schema that might be more databasey than you need. You configure ADO.NET via a series of mapping files which map the database model to a model that is conceptually similar to the domain your application is in. The example that Niels gave was the AdventureWorks database, which is highly normalised and can be very hard to query against; with the ADO.NET v.Next framework you can map the multiple tables into a single .NET view that you can then query, with ADO.NET handling the translation between your query and the SQL query against the database. I am probably not explaining it that well, but I don’t think LINQ will replace this product; rather, ADO.NET v.Next will remove the need to encode the exact database structure into your LINQ queries, and will let you query a model that you have defined via the mapping files.
Kevlin Henney’s “Streamlined Object-Oriented Analysis with UML and Use Cases” was a very good lecture; he really knows his stuff. I say that because he could talk for ages and ages about the topic and it all made sense. His basic statement was that analysis shouldn’t be that hard, and that a lot of the books about analysis with UML are written by developers from a developer’s perspective, with the result that the writers tend to suggest the analysis stage is basically the design stage with some of the detail left out. In actual fact you should be modelling the system as it currently is, by looking at the systems and decomposing the problem into a resulting model. Models remove the fluff and unrelated information and show you what the problem is in terms that can be solved. From there you can set about solving the problem, starting with use cases that rationalise the solution and the requirements into discrete packages of functionality and work. He also talked about modelling the interactions of the system: decomposing the information, its relationships with other information, and its intended flow through the system that you are creating. There was a lot more to this talk, and it went into far more detail than I can write up now. Kevlin, if you are reading this and it sounds like rubbish, please feel free to correct me.
Ingo Rammer’s first lecture that I attended, “(Re-)Designing for Scalability and Performance”, was packed out, and for good reason. It was a lecture about some of the (anonymised) consulting experiences he has had, and the steps he took to improve the scalability of his clients’ applications. I thought this session was really interesting because it touched on some of the design decisions that we all make and how they can cause problems when it comes to scaling the applications that we create. He also gave some practical advice about data caching, such as caching the data in a small SQL Server instance that sits on the same machine as the web server. I believe he implied that because it sits on the same machine it is very fast to get data from, as the connection uses named pipes, which are very efficient.
Ingo Rammer’s second lecture that I attended was about Windows Workflow and how to integrate it into applications. I already know a little about Windows Workflow, and this was the second lecture he did about WF. I am glad I missed the first one: not because he is a bad speaker, he is brilliant, but because I think the first one would have covered what I already knew to some extent. Ingo talks quickly and gets through a lot of information, and I believe he is very good at what he does. He talked about some of the custom activities, as well as asynchronous activities and how best to code them. I had a question to ask at the end of the presentation, and he was very personable and had an instant answer, which cleared up my understanding of WF a little. Additionally, being nosey, I stayed a bit longer to see what other people were asking, and he was very good with his answers.
I got an email from Peter Lindsey, the managing director of Infragistics Europe, about the experiences I had at their demo booth. I met up with him between sessions and got to see the Infragistics products a bit better this time. Some of the XAML stuff they are doing is pretty cool; the data grid is smart, and so is the Carousel ListBox. I was attempting to implement my own carousel list box in XAML a little while ago, and while I think I could have made a fairly decent one given the time, the Infragistics one is pretty darn smart.
Olive360, so far I think the lectures are of an amazing quality, but if you are to improve on this year’s DevWeek I would make it easier and more appealing for vendors to be present. There are not that many companies here. I like to see what people are offering, and I like to meet the people who are selling the software so that I can tell whether they are simply resellers of a product or whether they develop the software themselves. The majority of the lectures were about .NET and .NET technologies, yet a couple of the vendors were C++ specific. Also, to some of the vendors: you were terrible! You came to sell something, yet I felt that the assistants were not really that bothered.
On a lighter note, even though I am an avid ReSharper user, Mark Miller from DevExpress was there showing off his product, and it does look pretty darn smart. His presentations are slick, his attitude is spot on, and he is even respectful of his competitors. If only there were more people like him.
I forgot to mention that yesterday I spoke to DevelopMentor; man, do they seem like a really cool training company. They have experts in domains that are new and current, and the courses are their own, based around the trainers’ experiences and knowledge rather than the courseware that Microsoft leases out to other training companies.