Models for Angular and ASP.NET Web API

I recently began working on a project with a friend of mine who is a .NET IT consultant. I originally planned to use the familiar Durandal framework with ASP.NET’s Web API, but ultimately decided to go with AngularJS since the Angular team hired the Durandal guy. Both frameworks leave model design and server integration up to you. Rather than reach for something like Backbone, I decided to tailor my approach specifically to Web API. Whether that was a good choice remains to be seen, but here’s some sample code demonstrating my approach: a generic JavaScript API client and a few models that consume it.

The context module below serves as my API interface. It can query, insert, update, and delete data; it’s basically a set of helpers built on top of ngResource. One important aspect is that the context calls return the result of the $resource calls, passing through the ability to use .$promise.then(someFunction) in my controller scripts.

context.js

angular.module('voyager.context', ['ngResource'])
    .service('apiService', ['$resource', function ($resource) {
        var API = function (entity) {
            return $resource('/api/' + entity + '/:id', {}, {
                update: { method: 'PUT' },
                insert: { method: 'POST' },
                destroy: { method: 'DELETE' }
            });
        };

        // Returns an array that populates asynchronously once the query
        // resolves, so it can be bound to scope immediately.
        this.query = function (entity, model) {
            var result = [];
            var api = API(entity);
            api.query().$promise.then(function (data) {
                angular.forEach(data, function (element) {
                    result.push(new model(element));
                });
            });
            return result;
        };

        // The remaining calls return the $resource result directly, so
        // callers can chain on .$promise; an optional success callback
        // is forwarded to $resource as well.
        this.update = function (entity, id, element, success) {
            return API(entity).update({ id: id }, element, success);
        };

        this.insert = function (entity, element, success) {
            return API(entity).insert(element, success);
        };

        this.destroy = function (entity, id, success) {
            return API(entity).destroy({ id: id }, success);
        };
    }]);
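To see why returning the $resource result matters, here’s a minimal sketch of the passthrough pattern in plain JavaScript. The stub service below is illustrative only (a plain Promise stands in for ngResource’s $promise, and the names are invented), but the chaining it enables is exactly what my controllers rely on:

```javascript
// Stub demonstrating the passthrough pattern: the service returns the
// resource call's result, so the caller can chain on its promise.
// A plain Promise stands in for ngResource's $promise here.
var apiService = {
    update: function (entity, id, element) {
        // Pretend this is $resource(...).update({ id: id }, element):
        // it returns immediately with a $promise attached.
        return { $promise: Promise.resolve({ id: id, saved: true }) };
    }
};

// Controller-style usage: chain directly on the returned call.
apiService.update('contacts', 42, { companyName: 'Acme' })
    .$promise.then(function (response) {
        console.log('saved contact', response.id);
    });
```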

Next is my models script. This could be split out for better organization, but I’m writing a fairly simple application, so I grouped everything under the “voyager.models” module.

models.js

var voyagerModels = angular.module('voyager.models', ['voyager.context']).
  factory('models', ['apiService', function (apiService) {
      var models = {};

      models.api = {};

      // begin Contact
      models.api.Contact = function (serverContact) {
          // Defaults for a new, unsaved contact; a contactID of 0
          // tells save() to insert rather than update.
          this.contactID = 0;
          this.companyName = "New Company";
          this.taxID = null;
          this.fmcNumber = null;
          this.address1 = null;
          this.address2 = null;
          this.city = null;
          this.stateOrProvince = null;
          this.postalCode = null;
          this.countryCode = 'US';
          this.contactFirstName = null;
          this.contactLastName = null;
          this.phone1 = null;
          this.phone1Ext = null;
          this.phone2 = null;
          this.phone2Ext = null;
          this.fax = null;
          this.faxExtension = null;
          this.email = null;
          this.comments = null;
          this.fullAddress = function () {
              var full = this.address1 || "";
              if (this.address2) {
                  full += " " + this.address2;
              }
              if (this.city) {
                  full += ", " + this.city;
              }
              if (this.stateOrProvince) {
                  full += ", " + this.stateOrProvince;
              }
              if (this.postalCode) {
                  full += " " + this.postalCode;
              }
              full += ", " + this.countryCode;
              return full;
          };

          // Overlay whatever the server sent on top of the defaults.
          angular.extend(this, serverContact);
      };

      models.api.Contact.entity = function () {
          return "contacts";
      };

      models.api.Contact.query = function () {
          return apiService.query(models.api.Contact.entity(), models.api.Contact);
      };

      models.api.Contact.prototype.save = function (callback) {
          if (this.contactID) {
              return apiService.update(models.api.Contact.entity(), this.contactID, this, callback);
          } else {
              return apiService.insert(models.api.Contact.entity(), this, callback);
          }
      };

      models.api.Contact.prototype.destroy = function (callback) {
          return apiService.destroy(models.api.Contact.entity(), this.contactID, callback);
      };
      // end Contact
      return models;
    }]);

For brevity’s sake I’ve included just the Contact model as an example. Notice how I inject ‘apiService’, declared in my context module. Assuming you’re using the Angular project template, you’ll need to include both files in your BundleConfig.cs (just like all of your controllers) for this to work. Simply inject the models factory into your controller (using the same syntax used to inject the API service) and you can retrieve, save, and destroy Contacts with a single function call (i.e. myContact.save() or myContact.destroy(), or for retrieval, models.api.Contact.query()).
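The constructor pattern above (set defaults first, then angular.extend with the server payload) is what lets one model type represent both brand-new and server-loaded records. Here’s a stripped-down sketch of that hydration pattern in plain JavaScript; Object.assign stands in for angular.extend, and the fields shown are just a subset:

```javascript
// Minimal sketch of the hydration pattern used by models.api.Contact:
// assign defaults first, then overlay whatever the server sent.
function Contact(serverContact) {
    this.contactID = 0;                 // 0 means "not yet saved"
    this.companyName = 'New Company';
    this.countryCode = 'US';
    Object.assign(this, serverContact); // server values win over defaults
}

// Mirrors the insert-vs-update check inside save().
Contact.prototype.isNew = function () {
    return !this.contactID;
};

var fresh = new Contact();                                   // all defaults
var loaded = new Contact({ contactID: 7, companyName: 'Acme' });

console.log(fresh.isNew());      // true  -> save() would insert
console.log(loaded.isNew());     // false -> save() would update
console.log(loaded.countryCode); // 'US'  -> defaults survive omitted fields
```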

The Ruby challenge continues…

Well, this week I released my first Ruby app: jobs.workdistributed.com. I think I spent more time configuring the Ubuntu server it runs on (RamNode FTW) than writing code; it was a HUGE learning experience. I’m now much more familiar with:

  • Ruby (syntax & conventions)
  • Rails (the framework & conventions)
  • SendMail
  • Memcached
  • Mongoid
  • Stripe
  • Twitter (seriously)

Having a strong background with REST-style API development definitely helps with learning the Rails framework (it’s basically REST infused with awesome).

I’d post some code samples, but I didn’t write anything I thought was post-worthy.

My next Rails project will definitely have some samples, though. Stay tuned for updates on that. Meanwhile I have a contract job coming up that should give me a chance to learn about shipping and customs. I’ll post fun facts about that here as well.

A note on security

If you have an old Mustang SVT parked in your driveway, “cobra” probably isn’t a very secure password for your wireless network.

Got Ruby?

I’ve recently undertaken a personal challenge to become proficient in Ruby within a year (including Rails). Coming from a Windows programming background, it’s going to be fun. I do have some *nix experience and can bang (pun intended) my way around in the terminal pretty well already, so I’m not completely behind the 8-ball. Still, it’s a different culture altogether.

In order to teach myself, I’m writing a work item tracker called peachpack. You can watch my progress here.

So far it’s just a default page with devise authentication added on. Next step is adding some user roles and an admin panel.

ServiceStack and Json.NET

Greenshades uses ServiceStack to handle some portions of our service architecture. ServiceStack’s JsonServiceClient uses their home-brewed serialization library (ServiceStack.Text). While that serializer is indeed extremely fast (and for most purposes there’s no reason to shy away from it), a product of mine needed to preserve object references during serialization. As it turns out, another popular JSON serializer (Json.NET) supports this exact functionality. The simplest way to make this work would be to serialize the complex object data on the client side and pass it up in a JSON wrapper of some kind, but I decided I’d rather override the serialization functionality in ServiceStack. The following is an implementation of ITextSerializer (using ServiceStack v3; a similar IStringSerializer exists for v4 users) that ServiceStack will let us swap in for the default JSON serializer.

using Newtonsoft.Json;
using ServiceStack.DesignPatterns.Serialization;
using System;
using System.IO;

public class JsonNETServiceStackSerializer : ITextSerializer {
    public object DeserializeFromString(string serializedText, Type type) {
        return JsonConvert.DeserializeObject(serializedText, type);
    }

    public To DeserializeFromString<To>(string serializedText) {
        return JsonConvert.DeserializeObject<To>(serializedText);
    }

    public string SerializeToString<TFrom>(TFrom from) {
        // Preserve object references so repeated objects serialize as
        // $ref pointers instead of duplicate copies.
        return JsonConvert.SerializeObject(from, Formatting.None,
            new JsonSerializerSettings {
                PreserveReferencesHandling = PreserveReferencesHandling.Objects
            });
    }

    public object DeserializeFromStream(Type type, Stream fromStream) {
        CompressorUtility compy = new CompressorUtility();
        byte[] serial = compy.DecompressBytesFromStream(fromStream);
        // UTF-8 rather than ASCII so non-ASCII data survives the round trip.
        string json = System.Text.Encoding.UTF8.GetString(serial);
        return JsonConvert.DeserializeObject(json, type);
    }

    public T DeserializeFromStream<T>(Stream stream) {
        return (T)DeserializeFromStream(typeof(T), stream);
    }

    public void SerializeToStream<T>(T obj, Stream stream) {
        string json = JsonConvert.SerializeObject(obj, Formatting.None,
            new JsonSerializerSettings {
                PreserveReferencesHandling = PreserveReferencesHandling.Objects
            });
        byte[] serial = System.Text.Encoding.UTF8.GetBytes(json);
        CompressorUtility compy = new CompressorUtility();
        compy.CompressBytesToStream(serial, stream);
    }
}

In the above example I also used a CompressorUtility class to make the JSON payload even smaller. Here’s the code for that:

using System.IO;
using System.IO.Compression;

public class CompressorUtility {
    public void CompressBytesToStream(byte[] data, Stream stream) {
        byte[] compressed = CompressBytes(data);

        stream.Write(compressed, 0, compressed.Length);
    }
    public byte[] CompressBytes(byte[] data) {
        using (MemoryStream uncompressedStream = new MemoryStream(data)) {
            using (MemoryStream compressedStream = new MemoryStream()) {
                using (GZipStream compressionStream = 
                    new GZipStream(compressedStream, CompressionMode.Compress)) {
                    uncompressedStream.CopyTo(compressionStream);
                }
                return compressedStream.ToArray();
            }
        }
    }

    public byte[] DecompressBytesFromStream(Stream compressedStream) {
        using (MemoryStream decompressedStream = new MemoryStream()) {
            using (GZipStream decompressionStream = 
                new GZipStream(compressedStream, CompressionMode.Decompress)) {
                decompressionStream.CopyTo(decompressedStream);
            }
            return decompressedStream.ToArray();
        }
    }
}

Now comes the easy part. Just call ServiceStack’s built-in static method for overriding the serializer on both the client and the server before making/serving requests:

// using ServiceStack.ServiceModel.Serialization;
JsonDataContractSerializer.UseSerializer(new JsonNETServiceStackSerializer());
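For reference, here’s the kind of output PreserveReferencesHandling.Objects produces: the first occurrence of an object gets a "$id", and later occurrences become a "$ref" back to it instead of a duplicate copy. A small sketch of that shape (the object graph is invented, and I’m using JavaScript here just to inspect the JSON):

```javascript
// Shape of Json.NET output with PreserveReferencesHandling.Objects.
// Repeated objects are serialized once with "$id" and referenced
// thereafter with "$ref". The data below is invented for illustration.
var json = '{"$id":"1",' +
    '"customer":{"$id":"2","name":"Acme"},' +
    '"billTo":{"$ref":"2"}}';   // billTo is the same object as customer

var parsed = JSON.parse(json);
console.log(parsed.billTo.$ref === parsed.customer.$id); // true
```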