February 22, 2010 oslofeaturedcontent

Update for SQL Server Modeling CTP and Dev10 RC

We are currently preparing a release of the SQL Server Modeling November 2009 CTP that will install and operate with the Visual Studio 2010 Release Candidate. We expect to make this release available in the first week of March and will make an announcement here at that time.

We are also planning for another release of the November CTP that matches the final Visual Studio product (RTM) when that product becomes generally available.

February 15, 2010

Windows Phone 7 Series Link Roundup

I was swamped this morning, so didn’t get to see the Windows Phone 7 Series stuff live. When I did finally get to poke my head out, Scott Stanfield and Damir Tomicic had the list of links all set for me to follow. Thanks, guys!

Now I just have to let my AT&T contract run out and find a buyer for my iPhone. Can’t wait!

February 9, 2010 tools

Entity Designer Database Generation Power Pack

If you like Model-First design in Entity Framework, you’re going to love the Entity Designer Database Generation Power Pack. The original Database Generation feature in the Entity Designer in VS 2010 is extensible via Windows Workflow Foundation workflows and T4 templates. This Power Pack builds on these extensibility mechanisms and introduces the following:
  • Basic Table-per-Hierarchy (TPH) support, represented by the “Generate T-SQL via T4 (TPH)” workflow (see the sketch after this list for what TPH means for the generated schema).
  • The SSDL and MSL generation pieces can now be tweaked through T4 templates, in both the TPH and TPT strategies, via the “Generate T-SQL via T4 (TPT)” and “Generate T-SQL via T4 (TPH)” workflows.
  • Direct deployment and data/schema migration are available through the “Generate Migration T-SQL and Deploy” workflow. This workflow uses the Team System Data APIs to diff the default T-SQL script against the target database and create a new script that performs non-invasive ALTERs and data migration where necessary.
  • A new user interface now displays when “Generate Database from Model” is selected; it acts as a “workflow manager” that presents the default workflows and lets you create your own customizable workflows based on your own strategy, script generation, and deployment requirements.
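
For readers new to the inheritance-mapping jargon, here’s a minimal sketch of what the TPH and TPT strategies mean for the generated schema (a hypothetical Person/Employee model, not taken from the Power Pack itself):

abstract class Person {
  public int Id { get; set; }
  public string Name { get; set; }
}

class Employee : Person {
  public decimal Salary { get; set; }
}

// TPH (Table-per-Hierarchy): one table for the whole hierarchy, with
// nullable columns for derived-type properties and a discriminator, e.g.
//   People(Id, Name, Salary NULL, Discriminator)
// TPT (Table-per-Type): one table per type, joined on the primary key, e.g.
//   People(Id, Name)
//   Employees(Id -> People.Id, Salary)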

Highly recommended. Enjoy!

February 7, 2010 .net

Data Binding, Currency and the WPF TreeView

I was building a little WPF app to explore a hierarchical space (OData, if you must know), so of course, I was using the TreeView. And since I’m a big fan of data binding, of course I’ve got a hierarchical data source (basically):

abstract class Node {
  public abstract string Name { get; }
  public abstract IEnumerable<Node> Children { get; } // child nodes ("Children" name assumed; it was missing in the original)
  public XDocument Document { get { ... } }
  public Uri Uri { get { ... } }
}

I then bind to the data source (of course, you can do this in XAML, too):

// set the data context of the grid containing the treeview and text boxes
grid.DataContext = new Node[] { Node.GetNode(new Uri(uri)) };
// bind the treeview to the entire collection of nodes
leftTreeView.SetBinding(TreeView.ItemsSourceProperty, ".");
// bind each text box to a property on the current node
queryTextBox.SetBinding(TextBox.TextProperty,
  new Binding("Uri") { Mode = BindingMode.OneWay });
documentTextBox.SetBinding(TextBox.TextProperty,
  new Binding("Document") { Mode = BindingMode.OneWay });

What we’re trying to do here is leverage the idea of “currency” in WPF, where if you share the same data context, then item controls like textboxes will bind to the “current” item as it’s changed by the list control. If this were a listview instead of a treeview, that would work great (so long as you set the IsSynchronizedWithCurrentItem property to true).
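
For comparison, here’s a minimal sketch of the listview case, where currency works on its own (the leftListView name and the shared data context are assumed for illustration):

// with a single flat collection and a shared data context, the
// textboxes track the listview's current item automatically
grid.DataContext = new Node[] { Node.GetNode(new Uri(uri)) };
leftListView.IsSynchronizedWithCurrentItem = true;
leftListView.SetBinding(ListView.ItemsSourceProperty, ".");
queryTextBox.SetBinding(TextBox.TextProperty,
  new Binding("Uri") { Mode = BindingMode.OneWay });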

The problem, as my co-author and the-keeper-of-all-WPF-knowledge Ian Griffiths reminded me this morning, is that currency is based on a single collection, whereas a TreeView control is based on multiple collections, i.e. the one at the root and one at each sub-node, etc. So, as I change the selection on the top node, the treeview has no single collection’s current item to update (stored in an associated “view” of the data), so it doesn’t update anything. As the user navigates from row to row, the “current” item never changes and our textboxes are not updated.

So, Ian informed me of a common “hack” to solve this problem. The basic idea is to forget about the magic “current node” and explicitly bind each control to the treeview’s SelectedItem property. As it changes, regardless of which collection the item came from, each item control is updated, just as data binding is supposed to work.

First, instead of setting the grid’s DataContext to the actual data, shared with the treeview and the textboxes, we bind it to the currently selected treeview item:

// bind the grid containing the treeview and text boxes
// to point at the treeview's currently selected item
grid.SetBinding(FrameworkElement.DataContextProperty,
  new Binding("SelectedItem") { ElementName = "leftTreeView" });

Now, because we still want the treeview to show our hierarchical collection of nodes, we set its DataContext explicitly:

// set the treeview's DataContext to be the data we want it to show
leftTreeView.DataContext = new Node[] { Node.GetNode(new Uri(uri)) };
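
One piece not shown above is how the treeview knows to walk each node’s children; that’s the job of a HierarchicalDataTemplate, typically declared in XAML. Here’s a code-behind sketch, assuming the Children property from the Node class above:

// tell the treeview how to expand the hierarchy: each item's children
// come from the Children property and each node displays its Name
var template = new HierarchicalDataTemplate();
template.ItemsSource = new Binding("Children");
var text = new FrameworkElementFactory(typeof(TextBlock));
text.SetBinding(TextBlock.TextProperty, new Binding("Name"));
template.VisualTree = text;
leftTreeView.ItemTemplate = template;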

Now the treeview will show the data we wanted it to show, as before, but as the user changes the selection, the treeview’s SelectedItem property changes, which updates the grid’s DataContext. And because the DataContext property is inherited (and we haven’t overridden it on the textboxes), the textboxes, bound to properties on the grid’s DataContext, are updated as well.

Or, in other words, the textboxes effectively have a new idea of the “current” item that meshes with how the treeview works. Thanks, Ian!

February 4, 2010 oslofeaturedcontent

Telerik LINQ to M Refresh for Nov09 Modeling CTP

The Telerik “LINQ to M” implementation allows developers to use LINQ statements with blocks of “M” values, pure text, or the results of a transformed DSL. With the new SQL Server Modeling November 2009 CTP there are some changes to the “M” specification, so Telerik has updated their core DLLs to accommodate these changes. Enjoy!

February 4, 2010 oslofeaturedcontent

Deep Fried Bytes: Doug Purdy on OData and Modeling

“In the 43rd episode of Deep Fried Bytes, Keith and Woody sit down at PDC 2009 with Microsoft’s Douglas Purdy to discuss all things data. Do you remember Oslo from the previous PDC event? Well, Oslo has been rebranded to SQL Server Modeling Services to help developers store and manage models for the enterprise. Modeling Services enables you to be more productive when building and managing data-driven applications. The guys also get the low down from Douglas on a new web protocol for querying and updating data called OData.”

February 2, 2010 oslofeaturedcontent

Rocky’s video series on SQL Server Modeling and CSLA

Rockford Lhotka has created a series of three videos showing how he has applied SQL Server Modeling, specifically “M”, to drive his well-known CSLA, a framework for building the business logic layer in your applications. He shows a custom domain-specific language (DSL) that lets you create a CSLA entity, along with the data serialization, business logic and a forms-based UI, resulting in a 95% coding savings (his words, not mine : ). Enjoy!

February 2, 2010 spout

We need cloud apps to use cloud drives

Reading about Windows Azure Drive reminded me of a conversation I had when I was hanging out with my Microsoft brethren last week. We started by talking about how apps target a particular OS and how Microsoft’s bread-and-butter is making sure that apps continue to work forever on Windows so that our customers can upgrade their OS and still get their work done.

We then moved on to wondering whether Apple was gonna do the same thing when it came to letting iPhone/iPod Touch apps run on the new iPad. As it turns out, we heard from the iPad announcement that Apple is doing just that (although in a particularly strange single-tasking way).

From there we moved on to how it’s really not a big deal whether you ditch your current smart phone, e.g. Dash, iPhone, BlackBerry, Droid, etc., for another one because nobody really keeps data on their phones anymore anyway. It’s either synch’d to their PC, e.g. photos, music, etc., or it’s kept in the cloud. In fact, without realizing it, I already have a great deal of info in the cloud:

  • Exchange: email, contacts, appointments
  • TweetDeck: twitter search terms
  • Evernote: random notes
  • Amazon: Kindle books
  • TripIt: trip itineraries
  • Mint: account and budget information
  • Facebook: social contacts
  • LinkedIn: business contacts

Further, I could keep my pictures on Flickr and my documents on Live, and I’m sure there are many, many more. This is fabulous, because I can move from platform to platform on my phone: it’s in each vendor’s interest to make sure every major platform has their app on it, and because the phone is a smaller, more focused platform, it’s easier for them to do.

The problem here, of course, is that we’ve moved from mobile vendor lock-in to cloud data storage lock-in. What happens when Amazon decides to repossess another book or Mint decides to start charging or Flickr goes out of business? Unlike the physical storage business (you know, little garages where people keep stuff when their relatives die or they’re going through a divorce), the logical storage business doesn’t have any legal responsibility to keep the doors open for 30 days when they go out of business to let me move my stuff somewhere else.

And this has already happened. When GeoCities went out of business, all of those people’s web sites were gone. When live.com decided to clean out my set of RSS feeds, there wasn’t any notification or recourse. I’m sure there are more similar stories and there will be lots more in the future.

And because I know there will be more, I’m worried.

Right now, as we move our apps and storage into the cloud, we have a very different dynamic than we had with apps and storage on the desktop. Because apps on the desktop use storage I own, I can back up that data, import it into other programs and, if I feel like it, write programs against it. It’s my data. The vendor doesn’t ever even see it, let alone gate my access to it.

On the other hand, cloud app vendors ARE gating access to my data; I have to use their apps to get to it. Unless there’s some pressure, you can be damned sure the Flickrs and Mints and Amazons aren’t going to be giving up the data they’ve got a wall around now so that I can take it to a competitor.

Which is why we need control over the storage for cloud apps just as much as we do for desktop apps. I want to go to a vendor I trust, e.g. Amazon, Microsoft, GE, i.e. someone big, someone you know is gonna be around for a while, and purchase cloud storage from them. I want to be able to use it as an HD for my desktop data (like Azure Drive and other products before it), including general-purpose backup, but I also want my cloud apps to store their data there, too. That way, if they start charging or they go out of business or I want to go somewhere else with my data, I can do so.

I expect to pay for this service, of course. That way, the cloud storage folks make money, the cloud apps folks make money for using my cloud storage and I get peace of mind knowing that I’ll always have access to my data, no matter what happens to the cloud app or the cloud app vendor, just like today.

We need cloud apps to use cloud drives. Call your congressman!

