By Philippe Leefsma
Here is the second part of our cloud viewer project. In the first post we took a look at how to create a basic WCF web service, host it locally in IIS and create an asynchronous client for our service.
Today we will take it a bit further by adding some Amazon API functionalities to our previous web service in order to host files and data on the cloud.
The goal in this post is to enhance our service so that an “administrator” user (no permission-related considerations are implemented here) can manage models, uploading or deleting them through a console application, while client users simply visualize those models on demand from a separate viewer desktop application. As a reminder, the actual “models” here are simple picture files (.png, .jpg, .bmp), but the architecture is pretty close to the real viewer demo we are actually working on.
I - Enhancing the existing web service
We are going to add three new methods to the public service interface, in order to retrieve model data (actually an array of bytes representing a streamed image), upload a picture and its info to the server, and remotely delete it using the console.
The new methods and data structures we are going to use are exposed below; they are pretty much self-explanatory:
/////////////////////////////////////////////////////////////////////////////
// Web service public interface: these are the methods exposed
// to the clients
/////////////////////////////////////////////////////////////////////////////
[ServiceContract]
public interface IAdnCloudViewerSrv
{
// Returns the list of all cloud-hosted models
[OperationContract]
ModelInfo[] GetDbModelInfo();
// Returns model data for a specific model
[OperationContract]
byte[] GetDbModel(string modelId);
// Upload model info and data to server
[OperationContract]
bool AddDbModel(RemoteModelData modelData);
// Remotely delete a cloud model
[OperationContract]
bool DeleteDbModel(string modelId);
}
/////////////////////////////////////////////////////////////////////////////
// The ModelInfo class keeps information about a specific model
//
/////////////////////////////////////////////////////////////////////////////
[DataContract]
public class ModelInfo
{
public ModelInfo(int height, int width, string ext, string id)
{
Height = height;
Width = width;
FileExt = ext;
ModelId = id;
}
[DataMember]
public int Height
{
get;
private set;
}
[DataMember]
public int Width
{
get;
private set;
}
[DataMember]
public string FileExt
{
get;
private set;
}
[DataMember]
public string ModelId
{
get;
private set;
}
}
/////////////////////////////////////////////////////////////////////////////
// The RemoteModelData class bundles a model's info with its raw image data
//
/////////////////////////////////////////////////////////////////////////////
[DataContract]
public class RemoteModelData
{
public RemoteModelData(ModelInfo modelInfo, byte[] data)
{
ModelInfo = modelInfo;
Data = data;
}
[DataMember]
public ModelInfo ModelInfo
{
get;
private set;
}
[DataMember]
public byte[] Data
{
get;
private set;
}
}
II – Getting started with AWS API
The next thing to do is obviously to implement those four methods in our service. That’s where the Amazon Web Services come into play. You will first have to register for Amazon, then for each web service you are going to need. In this case we need the following:
The signup page: http://aws.amazon.com/
Check the Developers section, it contains doc and links to all the resources about AWS APIs, classified by topic and by programming language
The file hosting service: Amazon Simple Storage Service (S3)
A simple database service: Amazon SimpleDb
We are not yet going to deploy our application to the cloud; that will be the topic of the next post. For now we will still run our service locally, but it will connect to AWS on the cloud, so a working Internet connection is required to run it.
We also need to download the .Net SDK, which comes along with various C# samples: http://aws.amazon.com/net/
After going through the getting-started guide of each section and taking a look at the samples, you should be able to put together some basic code to create a SimpleDb database; this will allow the service to access information about which models are hosted on the cloud without having to analyze the data itself. You will also be able to upload/download files from S3, where the actual image data will be stored and fetched to the client on demand.
Here is some basic Amazon API code. You will notice that my access and secret keys have been omitted; these are what you will get once you are registered. Just to mention: the registration is free, but the use of the services is not, so keep that in mind.
string _domainName = "AdnCloudViewerTutDomain";
string _bucketName = "AdnCloudViewerTutBucket";
AmazonSimpleDB _sdbClient;
AmazonS3 _s3Client;
/////////////////////////////////////////////////////////////////////////////
// Constructor: initializes Amazon clients
//
/////////////////////////////////////////////////////////////////////////////
public AdnCloudViewerSrv()
{
try
{
AmazonSimpleDBConfig dbConfig = new AmazonSimpleDBConfig();
_sdbClient = AWSClientFactory.CreateAmazonSimpleDBClient(
AwsAccessKey, // place here your own key
AwsSecretKey, // place here your own key
dbConfig);
if (!HasDomain(_domainName))
CreateDomain(_domainName);
AmazonS3Config s3Config = new AmazonS3Config();
_s3Client = AWSClientFactory.CreateAmazonS3Client(
AwsAccessKey, // place here your own key
AwsSecretKey, // place here your own key
s3Config);
if (!HasBucket(_bucketName))
CreateBucket(_bucketName);
}
catch (Exception ex)
{
System.Windows.Forms.MessageBox.Show(ex.Message);
}
}
/////////////////////////////////////////////////////////////////////////////
// Checks if SimpleDb domain exists
//
/////////////////////////////////////////////////////////////////////////////
private bool HasDomain(string domain)
{
try
{
ListDomainsResponse sdbListDomainsResponse =
_sdbClient.ListDomains(new ListDomainsRequest());
if (sdbListDomainsResponse.IsSetListDomainsResult())
{
ListDomainsResult listDomainsResult =
sdbListDomainsResponse.ListDomainsResult;
foreach (String str in listDomainsResult.DomainName)
{
if (str == domain)
return true;
}
}
return false;
}
catch
{
return false;
}
}
/////////////////////////////////////////////////////////////////////////////
// Creates SimpleDb Domain
//
/////////////////////////////////////////////////////////////////////////////
private bool CreateDomain(string domain)
{
try
{
CreateDomainRequest request = new CreateDomainRequest()
.WithDomainName(domain);
CreateDomainResponse response = _sdbClient.CreateDomain(request);
return true;
}
catch
{
return false;
}
}
/////////////////////////////////////////////////////////////////////////////
// Checks if S3 bucket exists
//
/////////////////////////////////////////////////////////////////////////////
private bool HasBucket(string bucketName)
{
try
{
using (ListBucketsResponse response = _s3Client.ListBuckets())
{
foreach (S3Bucket bucket in response.Buckets)
{
if (bucket.BucketName == bucketName)
return true;
}
}
return false;
}
catch (AmazonS3Exception amazonS3Exception)
{
return false;
}
}
/////////////////////////////////////////////////////////////////////////////
// creates S3 bucket
//
/////////////////////////////////////////////////////////////////////////////
private bool CreateBucket(string bucketName)
{
try
{
PutBucketRequest request = new PutBucketRequest()
.WithBucketName(bucketName);
_s3Client.PutBucket(request);
return true;
}
catch (AmazonS3Exception amazonS3Exception)
{
return false;
}
}
III – More advanced AWS stuff
Amazon provides three different database technologies, each of which has its pros and cons. I picked SimpleDb because I wanted to get started with the easiest one. If you are looking into more serious cloud applications, you may want to take a deeper look at each of them:
http://aws.amazon.com/simpledb/
http://aws.amazon.com/dynamodb/
http://aws.amazon.com/rds/
Let’s take a more detailed look at the actual implementation of my service based on the AWS API, starting with “GetDbModelInfo”. This method should return the model info records stored in our SimpleDb database. With this API it is possible to build up a SQL-like string in order to query the database, so in this case I will request the “Height”, “Width”, “FileExt” and “ModelId” attributes of each model and return that info to the client in an array:
/////////////////////////////////////////////////////////////////////////////
// Returns existing models info stored in SimpleDb database
//
/////////////////////////////////////////////////////////////////////////////
public ModelInfo[] GetDbModelInfo()
{
try
{
if (!HasDomain(_domainName))
return null;
String expression =
"Select " +
"Height, " +
"Width, " +
"FileExt, " +
"ModelId From " + _domainName;
SelectRequest request = new SelectRequest()
.WithSelectExpression(expression);
SelectResponse response = _sdbClient.Select(request);
if (!response.IsSetSelectResult())
return null;
List<ModelInfo> dataList = new List<ModelInfo>();
SelectResult result = response.SelectResult;
foreach (Item item in result.Item)
{
ModelInfo data = DbItemToModelInfo(item);
if(data != null)
dataList.Add(data);
}
return dataList.ToArray();
}
catch
{
return null;
}
}
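The “DbItemToModelInfo” helper used above is not shown in the listing. Here is a minimal sketch of what it could look like, assuming the SDK exposes the item attributes as a Name/Value collection, as in the Amazon samples (the attribute names match those written later by “AddDbModel”):

```csharp
/////////////////////////////////////////////////////////////////////////////
// Hypothetical helper: converts a SimpleDb Item into a ModelInfo.
// Returns null if an expected attribute is missing or malformed.
/////////////////////////////////////////////////////////////////////////////
private ModelInfo DbItemToModelInfo(Item item)
{
    try
    {
        // Collect the item attributes into a name -> value map
        var map = new Dictionary<string, string>();
        foreach (Amazon.SimpleDB.Model.Attribute attribute in item.Attribute)
            map[attribute.Name] = attribute.Value;

        return new ModelInfo(
            int.Parse(map["Height"]),
            int.Parse(map["Width"]),
            map["FileExt"],
            map["ModelId"]);
    }
    catch
    {
        return null;
    }
}
```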
The second method returns the image data as an array of bytes that will need to be interpreted on the client side. The data was previously stored on S3 and can be retrieved as a stream that we convert to byte[]. The request to S3 is very straightforward:
/////////////////////////////////////////////////////////////////////////////
// Returns image data for requested modelId
//
/////////////////////////////////////////////////////////////////////////////
public byte[] GetDbModel(string modelId)
{
try
{
GetObjectRequest s3Request = new GetObjectRequest()
.WithBucketName(_bucketName)
.WithKey(modelId);
using (GetObjectResponse s3Response = _s3Client.GetObject(s3Request))
{
return StreamToByteArray(s3Response.ResponseStream);
}
}
catch(Exception ex)
{
return null;
}
}
// From
// stackoverflow.com/questions/221925/creating-a-byte-array-from-a-stream
//
public byte[] StreamToByteArray(Stream input)
{
using (MemoryStream ms = new MemoryStream())
{
input.CopyTo(ms);
return ms.ToArray();
}
}
Then we need to implement the upload mechanism. This is a two-step process: first we store the image bytes on S3, which needs a unique key in order to store and retrieve the data. Here I simply used the image file name, but my real model viewer uses a GUID instead, which better guarantees the uniqueness of the key. Then we create a new entry with the correct attributes in SimpleDb:
/////////////////////////////////////////////////////////////////////////////
// Uploads a model to server, then adds it to S3 and SimpleDb
//
/////////////////////////////////////////////////////////////////////////////
public bool AddDbModel(RemoteModelData modelData)
{
try
{
if (!HasDomain(_domainName) || !HasBucket(_bucketName))
return false;
string s3Key = modelData.ModelInfo.ModelId;
PutObjectRequest s3Request = new PutObjectRequest();
using (MemoryStream ms = new MemoryStream(modelData.Data))
{
s3Request.WithBucketName(_bucketName)
.WithKey(s3Key)
.WithInputStream(ms);
S3Response response = _s3Client.PutObject(s3Request);
response.Dispose();
// the enclosing using block disposes the memory stream
}
PutAttributesRequest sDbRequest = new PutAttributesRequest()
.WithDomainName(_domainName)
.WithItemName(modelData.ModelInfo.ModelId)
.WithAttribute(new ReplaceableAttribute()
.WithName("ModelId")
.WithValue(modelData.ModelInfo.ModelId))
.WithAttribute(new ReplaceableAttribute()
.WithName("Height")
.WithValue(modelData.ModelInfo.Height.ToString()))
.WithAttribute(new ReplaceableAttribute()
.WithName("Width")
.WithValue(modelData.ModelInfo.Width.ToString()))
.WithAttribute(new ReplaceableAttribute()
.WithName("FileExt")
.WithValue(modelData.ModelInfo.FileExt));
PutAttributesResponse sDbResponse =
_sdbClient.PutAttributes(sDbRequest);
return true;
}
catch(Exception ex)
{
return false;
}
}
I am not including the code to delete a model here, but you can find it in the attached project: it basically deletes the data from S3 and removes the item entry from the database.
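For reference, a minimal sketch of what that delete implementation could look like, using the same request/response pattern as the methods above (the exact code lives in the attached project):

```csharp
/////////////////////////////////////////////////////////////////////////////
// Sketch of the delete operation: removes the image data from S3
// and the item entry from SimpleDb
/////////////////////////////////////////////////////////////////////////////
public bool DeleteDbModel(string modelId)
{
    try
    {
        // Remove the image data from S3
        DeleteObjectRequest s3Request = new DeleteObjectRequest()
            .WithBucketName(_bucketName)
            .WithKey(modelId);
        using (DeleteObjectResponse s3Response =
            _s3Client.DeleteObject(s3Request))
        {
        }
        // Remove the item entry from the SimpleDb domain
        DeleteAttributesRequest sDbRequest = new DeleteAttributesRequest()
            .WithDomainName(_domainName)
            .WithItemName(modelId);
        _sdbClient.DeleteAttributes(sDbRequest);
        return true;
    }
    catch
    {
        return false;
    }
}
```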
IV – Self hosting the WCF service for easy debugging
Our service is now fully implemented and we can deploy it locally on IIS the exact same way we did in the previous post. Unfortunately, in real life you will inevitably introduce a few bugs while coding the service, so you are more than likely going to need to debug it. A straightforward way to achieve this is to self-host the service in a .Net console (or WinForm, WPF, whatever executable…) application. The .Net framework makes this easy by providing the “ServiceHost” functionality. The code for my console application is really simple. I also added a couple of lines that print the service endpoints, to avoid bad surprises:
class Program
{
static void Main(string[] args)
{
using (ServiceHost srvHost = new ServiceHost(
typeof(AdnCloudViewerService.AdnCloudViewerSrv)))
{
srvHost.Open();
Console.WriteLine("---- AdnCloudViewerHost ----");
ServiceDescription serviceDescription = srvHost.Description;
foreach (ServiceEndpoint endpoint in serviceDescription.Endpoints)
{
Console.WriteLine(
"\nEndpoint - address: {0}", endpoint.Address);
Console.WriteLine(
" - binding name: {0}", endpoint.Binding.Name);
Console.WriteLine(
" - contract name: {0}", endpoint.Contract.Name);
}
Console.WriteLine(
"\n\nService started. Press <Enter> to stop it...");
Console.ReadLine();
if (srvHost.State != CommunicationState.Closed)
srvHost.Close();
}
}
}
When using the service host, we need to customize the config file a bit; it is now named “app.config” (versus “web.config” for an IIS deployment). Here is the content of my config file. Pay attention here: the config file is a crucial part that determines how our web service is going to behave. You can see that I am defining a named binding configuration and a service behavior that are assigned to my service, along with two endpoints: one for mex (metadata exchange) and the WCF client endpoint.
If you are new to WCF, I strongly suggest you spend some time getting familiar with config file creation. You can find more details on MSDN:
http://msdn.microsoft.com/en-us/library/ms733932.aspx
<?xml version="1.0"?>
<configuration>
<system.web>
<compilation debug="true" targetFramework="4.0"/>
<httpRuntime maxRequestLength="2147483647"/>
</system.web>
<system.serviceModel>
<bindings>
<basicHttpBinding>
<binding
name="AdnCloudServicesBinding"
transferMode="Streamed"
messageEncoding="Mtom"
maxBufferSize="2147483647"
maxBufferPoolSize="2147483647"
maxReceivedMessageSize="2147483647">
<readerQuotas
maxDepth="200"
maxStringContentLength="2147483647"
maxArrayLength="2147483647"
maxBytesPerRead="2147483647"
maxNameTableCharCount="2147483647"/>
</binding>
</basicHttpBinding>
</bindings>
<behaviors>
<serviceBehaviors>
<behavior name="AdnCloudServicesBehavior">
<serviceMetadata httpGetEnabled="true"/>
<serviceDebug includeExceptionDetailInFaults="true"/>
<dataContractSerializer maxItemsInObjectGraph="2147483647"/>
</behavior>
</serviceBehaviors>
</behaviors>
<services>
<service
behaviorConfiguration="AdnCloudServicesBehavior"
name="AdnCloudViewerService.AdnCloudViewerSrv">
<endpoint
address="mex"
binding="mexHttpBinding"
contract="AdnCloudViewerService.IAdnCloudViewerSrv"/>
<endpoint
name="WCFClientEndpoint"
address=""
binding="basicHttpBinding"
bindingConfiguration="AdnCloudServicesBinding"
contract="AdnCloudViewerService.IAdnCloudViewerSrv"/>
<host>
<baseAddresses>
<add baseAddress=
"http://localhost:80/AdnCloudViewer/AdnCloudViewerSrv.svc"/>
</baseAddresses>
</host>
</service>
</services>
</system.serviceModel>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
</startup>
</configuration>
We can now build our service host and, using the client, perform calls that we debug locally:
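For completeness, here is roughly what such a debug call from the client could look like. This assumes a service reference was generated for the proxy; “AdnCloudViewerSrvClient” is whatever class name “Add Service Reference” / svcutil produced, and your generated names may differ:

```csharp
// Hypothetical smoke test against the self-hosted service, using the
// "WCFClientEndpoint" endpoint name defined in app.config
var client = new AdnCloudViewerSrvClient("WCFClientEndpoint");
try
{
    ModelInfo[] models = client.GetDbModelInfo();
    if (models != null)
    {
        foreach (ModelInfo info in models)
        {
            Console.WriteLine("{0} ({1} x {2}, {3})",
                info.ModelId, info.Width, info.Height, info.FileExt);
        }
    }
    client.Close();
}
catch
{
    // never leave a faulted channel open
    client.Abort();
}
```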
V – Finalizing the implementation
You now have three options to host the service: through IIS, through the ServiceHost, or inside a Windows service. I suggest you investigate the pros and cons of each of them by reading further documentation:
http://msdn.microsoft.com/en-us/library/ms730158.aspx
The remaining work is to finalize the implementation of our client, so it can display the actual image data retrieved from the service inside a PictureBox control, and also to create another WinForm executable that will act as a management console, allowing the user to select an existing image file and upload it to the server, or delete existing images. I used a simple UI design to achieve that. You can refer to the attached project, which contains the full source code for each application. I ultimately redeployed my service in IIS in order to test it:
That’s the console UI and below is our client viewer:
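On the viewer side, turning the byte[] returned by “GetDbModel” into something a PictureBox can display is just a matter of wrapping it in a stream. A sketch, with illustrative names not taken from the attached project:

```csharp
/////////////////////////////////////////////////////////////////////////////
// Converts the raw bytes received from the service into an Image and
// displays it. GDI+ requires the source stream to stay open for the
// lifetime of an Image created by Image.FromStream, so we clone it
// into a standalone Bitmap before the stream is closed
/////////////////////////////////////////////////////////////////////////////
private void DisplayModel(byte[] data, PictureBox pictureBox)
{
    using (MemoryStream ms = new MemoryStream(data))
    {
        pictureBox.Image = new Bitmap(Image.FromStream(ms));
    }
}
```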
Our project is starting to get interesting now!
In the next post I will discuss how to actually deploy our service on a cloud based machine, so any desktop client viewer connected to the web can access our models…