WCF Data Services – Entity Set Access Rules

WCF Data Services makes it easy to set up your services.  One example of this is how simple it is to configure the allowed operations on entity sets.  For this example, we have the following entity model:

[Image: entity model diagram]

* Note that I am not a fan of capturing and storing usernames and passwords.  I’d rather let someone else do this so I don’t have to.  I’ll post something on this topic in the coming weeks. *

If you generate a WCF Data Service, you’ll end up with:

public class UserService : DataService<CoreEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}

So what does this service do?  Nothing, really.  By default, WCF Data Services locks down all entity sets, so since we haven’t explicitly set which operations are allowed on the entities, we can’t do anything.  However, adding the common line

config.SetEntitySetAccessRule("Users", EntitySetRights.All);

changes this picture so that every operation (full CRUD) is now allowed.  However, much more granular control is available.  Changing the line above to

config.SetEntitySetAccessRule("Users", EntitySetRights.ReadSingle
    | EntitySetRights.WriteAppend
    | EntitySetRights.WriteMerge
    | EntitySetRights.WriteReplace);

now allows retrieving a single user, creating a new user, and updating an existing user.  Retrieving multiple users is not allowed.  Combine this with QueryInterceptors and ChangeInterceptors and it becomes very easy to return only the logged-in user’s record and limit them to updating only their own record. 

[QueryInterceptor("Users")]
public Expression<Func<User, bool>> OnQueryUsers()
{
    return user => user.Username.Equals(HttpContext.Current.User.Identity.Name);
}
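
The write side can be enforced with a matching ChangeInterceptor.  This is a sketch (the method name and error message are mine; it assumes the same Users entity set and the HttpContext-based identity used above):

[ChangeInterceptor("Users")]
public void OnChangeUsers(User user, UpdateOperations operations)
{
    // Reject any insert, update, or delete that targets a record
    // other than the logged-in user's own.
    if (!user.Username.Equals(HttpContext.Current.User.Identity.Name))
    {
        throw new DataServiceException(401, "You may only modify your own record.");
    }
}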

There are a lot of different options available and thankfully the configuration model has been greatly simplified.  The days of configuring pure WCF will not be missed…at all.

WCF Data Services – Poorly Optimized Updates

As part of the next round of features for one of my apps, I needed to enable a synchronization / cloud scenario.  I was really loath to head down the path of creating explicit methods for every type of access I needed, so I thought it would be a good time to look into WCF Data Services.  After reading up on it and setting up some of the basics, I started trying to identify a more optimized approach to updates. 

One of the scenarios I wanted to support was updating an entity that hadn’t just been retrieved from a query.  Just like EF, the WCF Data Services client has an attach mechanism (AttachTo) geared toward attaching an existing entity.  My example entity is:

[Image: Category entity diagram]

The simplified example looks like:

Category category = new Category
{
    // Create an entity with
    // the primary key and
    // referential key.
    Id = 1,
    UserId = 1,
};

category.Name = "Updated Name";

// context is the generated proxy for my WCF
// Data Services.
context.AttachTo("Categories", category);
context.UpdateObject(category);

context.SaveChanges();

This saves (concurrency is disabled here for simplicity).  However, the save overwrites all of the other properties: while Name is correct, DisplayOrder, CreatedOn, ModifiedOn, and Version have all been reset to their default .NET values.  As this was my first foray into WCF Data Services, this was definitely not the behavior I expected.

I dug into this further, and it looks like WCF Data Services does not respect the change tracking that the underlying entity does on the client side.  The UpdateObject method marks the entire entity as modified, and SaveChanges appears to disregard the actual changes that were made to the entity.  I can, of course, get it to work correctly by either querying the entity and then modifying it, or by populating it from storage based on a previous query.  However, the concern here isn’t just functionality, it’s bandwidth.  For large entities or a large number of entities (such as in a synchronization scenario), this can lead to a lot of unnecessary traffic, which is going to be problematic for mobile clients.
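
For reference, the query-then-update pattern that does round-trip the values correctly (at the cost of an extra request) looks roughly like this; it assumes the generated Categories property on the same proxy context:

// Querying first lets the context see the entity's current
// values, so the subsequent save sends a complete, correct payload
// instead of default .NET values for the untouched properties.
Category category = context.Categories
    .Where(c => c.Id == 1)
    .First();

category.Name = "Updated Name";

context.UpdateObject(category);
context.SaveChanges();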

I found a blog post that covered this problem and a potential workaround.  To be honest, though, I’m a little surprised that there isn’t support for optimizing bandwidth consumption (especially given the mobile world).  I’ve reached out to Microsoft in hopes that my findings and the alternative suggested in the post are not the entire picture here.  I’ll update this post if I hear something back.
