Friday, October 4, 2013

JSON schema validation in Web API project

Recently I've been working on an integration point that accepts a JSON object as a request payload over HTTP. This sounds really trivial when using ASP.NET Web API. Still, there was an additional requirement to validate the request against a JSON schema that is distributed to customers as the specification.

Since it was my first experience with Web API, I started with some research and found the great article "How Web API does Parameter Binding". Here are two important notes from the article:
  • in my case a media type formatter seems to be a better option than a model binder
  • a Web API request body can be read only once
As for the formatter, Web API comes with JsonMediaTypeFormatter that can be configured to use Json.NET for serialization. This is pretty cool since Json.NET supports schema validation. Unfortunately, with the current JsonMediaTypeFormatter implementation it is not possible to plug in validation code easily, but for now there is a workaround solution that decorates the original formatter:
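A minimal sketch of such a formatter (assuming Json.NET's JsonValidatingReader from Newtonsoft.Json.Schema; for brevity it subclasses JsonMediaTypeFormatter rather than wrapping it, and the typeSchemaResolver callback is described below):

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Formatting;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Newtonsoft.Json.Schema;

public class ValidationAwareJsonMediaTypeFormatter : JsonMediaTypeFormatter
{
    private readonly Func<Type, JsonSchema> typeSchemaResolver;

    public ValidationAwareJsonMediaTypeFormatter(Func<Type, JsonSchema> typeSchemaResolver)
    {
        this.typeSchemaResolver = typeSchemaResolver;
    }

    public override async Task<object> ReadFromStreamAsync(
        Type type, Stream readStream, HttpContent content, IFormatterLogger formatterLogger)
    {
        // Let the original Json.NET based formatter do the actual deserialization.
        var result = await base.ReadFromStreamAsync(type, readStream, content, formatterLogger);

        var schema = typeSchemaResolver(type);
        if (schema != null && readStream.CanSeek)
        {
            // The body has already been consumed, so rewind it and read it once more,
            // this time through a validating reader that reports errors to the logger.
            readStream.Seek(0, SeekOrigin.Begin);
            var textReader = new JsonTextReader(new StreamReader(readStream)) { CloseInput = false };
            using (var validatingReader = new JsonValidatingReader(textReader) { Schema = schema })
            {
                validatingReader.ValidationEventHandler +=
                    (sender, args) => formatterLogger.LogError(string.Empty, args.Message);
                while (validatingReader.Read())
                {
                    // Walking through the whole document triggers the validation events.
                }
            }
        }

        return result;
    }
}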
ValidationAwareJsonMediaTypeFormatter should be configured with a typeSchemaResolver callback that provides a JsonSchema for the payload type being deserialized. To hack around the fact that the body stream has already been read by the original formatter, we seek the stream back to the beginning and pass it to a validating reader that reports all found errors to the standard IFormatterLogger. This approach requires the body to be read twice and only works for body streams that support seeking. Still, it suits our needs and is better than implementing a JSON formatter from scratch. If the following patch is applied, things should become easier and less hacky.
Now we need to configure the application to use the new formatter:
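Something along these lines in WebApiConfig.Register (a sketch; HardcodedJsonSchemaProvider is a hypothetical implementation sketched further below):

using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Any IJsonSchemaProvider implementation will do, see below.
        var schemaProvider = new HardcodedJsonSchemaProvider();

        // Replace the default JSON formatter with the validation-aware one.
        config.Formatters.Remove(config.Formatters.JsonFormatter);
        config.Formatters.Insert(0, new ValidationAwareJsonMediaTypeFormatter(schemaProvider.GetSchema));

        // ... routes and the rest of the usual configuration ...
    }
}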

I do not describe the details of IJsonSchemaProvider because it does not depend on Web API and can use any implementation (the easiest is probably a hardcoded dictionary).
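For illustration only, a hypothetical shape of such a provider (the CustomerOrder type and the schema file are made up for the example):

using System;
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json.Schema;

public interface IJsonSchemaProvider
{
    JsonSchema GetSchema(Type payloadType);
}

public class HardcodedJsonSchemaProvider : IJsonSchemaProvider
{
    // The simplest possible implementation: a dictionary of schemas per payload type.
    private static readonly IDictionary<Type, JsonSchema> Schemas = new Dictionary<Type, JsonSchema>
    {
        { typeof(CustomerOrder), JsonSchema.Parse(File.ReadAllText(@"Schemas\CustomerOrder.json")) }
    };

    public JsonSchema GetSchema(Type payloadType)
    {
        JsonSchema schema;
        return Schemas.TryGetValue(payloadType, out schema) ? schema : null;
    }
}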
That is basically all - if your provider returns a JSON schema for a specific type, then the request will be validated against it.



Sunday, September 8, 2013

Handling application enumerations and constants in plain SQL scripts

Our team aims to develop applications with a DDD approach, and we try to avoid putting any logic into the database. So in the database schema we usually use integers for enumeration values, uniqueidentifiers for GUIDs, etc. But sometimes we still have tasks that require some logic to be written in plain SQL - usually it is a data migration required by schema changes, some data-related bug fixes, or an export to a data warehouse to build BI reporting on top of it. We've been using several approaches to writing SQL scripts in a maintainable way, but not all of them work equally well.

Let's consider an example with users of different types. The user type is an enumeration that is stored in an integer column.
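A minimal setup for the example below (the table shape and the enumeration values are assumptions):

-- Application-side enumeration (for reference): UserType { Regular = 1, Administrator = 2, Support = 3 }
CREATE TABLE dbo.Users
(
    Id INT IDENTITY(1, 1) PRIMARY KEY,
    Name NVARCHAR(256) NOT NULL,
    UserType INT NOT NULL -- holds the UserType enumeration value
);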
Let's write some queries that return Administrator users only.
The easiest way is just to inline the integer values into the SQL script:
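For example (2 stands for UserType.Administrator here):

-- 2 is the integer value of UserType.Administrator, inlined as a magic number
SELECT u.Id, u.Name
FROM dbo.Users u
WHERE u.UserType = 2;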
This approach is OK for trivial scripts that do not require further support. Once you get a query referencing several enums and constants, it becomes complex and it is easy to make an error.

A bit better approach is to declare SQL variables (see the sketch below). Still, this approach has some drawbacks. If you need to structure your SQL code into several files, then you have to declare the same variables again and again in each file. Also, sometimes for debugging purposes you need to execute only a particular statement from a big file; this is not easy if the variables are declared at the beginning and you need to run something from the middle of the file.
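A sketch of the variables approach for the same query:

-- Has to be repeated at the top of every script (or file) that needs it
DECLARE @UserTypeAdministrator INT = 2;

SELECT u.Id, u.Name
FROM dbo.Users u
WHERE u.UserType = @UserTypeAdministrator;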
The approach that our team came up with during a huge data migration task is to use constant tables (see the sketch below). Constant tables do not mess with application tables since they are defined in a separate schema. All constants are defined only once and may be referenced without additional declarations. Another big benefit is that you get compilation and IntelliSense support for your constants in SQL :)
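A sketch of the constant tables approach (the exact table shape - a single-row table with one column per constant in a dedicated enum schema - is an assumption):

-- One-time setup, kept apart from application tables in its own schema
CREATE SCHEMA enum;
GO
CREATE TABLE enum.UserType
(
    Regular INT NOT NULL,
    Administrator INT NOT NULL,
    Support INT NOT NULL
);
INSERT INTO enum.UserType (Regular, Administrator, Support) VALUES (1, 2, 3);
GO

-- Any script can now reference the constants by name, with IntelliSense support
SELECT u.Id, u.Name
FROM dbo.Users u
CROSS JOIN enum.UserType ut
WHERE u.UserType = ut.Administrator;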

Tuesday, May 7, 2013

Verifying application emails from auto tests

Recently our team faced a challenge to cover the user registration process with automated regression tests. Our setup is a web application with background services (which actually send out the emails) and functional regression tests powered by Selenium WebDriver. Still, the approach described below can be useful for testing other application types like desktop applications, web sites, etc.

The use case is pretty standard - a user fills in some personal data on a registration form, and the application sends out an email to the specified address with a confirmation link containing a unique token. The challenge was to automatically capture the email message and extract the link.
Since we want to do real black-box testing, an approach with shared access to the application database is not considered a suitable option. The alternative is to configure the target application to either send emails via a special fake SMTP server or drop them directly to a specific folder.

For manual email testing during development we've been using SMTP-Impostor for a while. There are similar tools like smtp4dev, FakeSMTP and others, but I did not find any that can be easily set up as a background service. We use a dedicated test server, and running a desktop application all the time is not convenient (but still possible :)) in such a setup.

If you're a .NET developer, you get another cool option: configure your application (any application, not only web) to drop all emails to a specific pickup directory. Please find the configuration magic here.
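In short, it is the standard mailSettings section in app.config/web.config (the pickup directory path below is just an example):

<system.net>
  <mailSettings>
    <smtp deliveryMethod="SpecifiedPickupDirectory">
      <specifiedPickupDirectory pickupDirectoryLocation="C:\Temp\TestEmails" />
    </smtp>
  </mailSettings>
</system.net>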
And that's it - now we can find all emails that the target application sends out as .eml files in the specified directory.

The way to read and parse the .eml files will depend on the platform you're building your tests on. Below I'll describe how it can be done in a .NET solution.

To parse .eml files we can first install the OpenPop.NET package:
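From the Package Manager Console (assuming OpenPop.NET is the package id on NuGet):

Install-Package OpenPop.NET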

Then create a storage abstraction on top of the pickup directory:
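A minimal sketch, assuming OpenPop's Message.Load API and the pickup directory configured above:

using System.Collections.Generic;
using System.IO;
using System.Linq;
using OpenPop.Mime;

// A thin abstraction over the pickup directory: every .eml file becomes a parsed message.
public class PickupDirectoryEmailStorage
{
    private readonly string pickupDirectory;

    public PickupDirectoryEmailStorage(string pickupDirectory)
    {
        this.pickupDirectory = pickupDirectory;
    }

    public IEnumerable<Message> GetAllEmails()
    {
        return new DirectoryInfo(pickupDirectory)
            .GetFiles("*.eml")
            .OrderByDescending(file => file.CreationTimeUtc)
            .Select(Message.Load);
    }
}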
And finally filter the emails using LINQ:
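For example, finding the confirmation email for the just-registered user and pulling the link out of its body (the recipient address, subject and link format are assumptions of the sketch):

using System.Linq;
using System.Text.RegularExpressions;

var storage = new PickupDirectoryEmailStorage(@"C:\Temp\TestEmails");

// Find the message addressed to the user that the test has just registered.
var confirmationEmail = storage.GetAllEmails()
    .First(m => m.Headers.To.Any(to => to.Address == "john.doe@example.com")
                && m.Headers.Subject.Contains("Confirm your registration"));

// Extract the confirmation link from the plain text body.
var body = confirmationEmail.FindFirstPlainTextVersion().GetBodyAsText();
var confirmationLink = Regex.Match(body, @"https?://\S+").Value;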
There is one more thing that we need to think about - scheduling a job to clean up old .eml files. The following PowerShell script may be useful:
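For example, a scheduled task could run something like this to drop everything older than a day (the path and retention period are just examples):

# Delete .eml files older than one day from the pickup directory
$pickupDirectory = "C:\Temp\TestEmails"
$threshold = (Get-Date).AddDays(-1)

Get-ChildItem -Path $pickupDirectory -Filter *.eml |
    Where-Object { $_.LastWriteTime -lt $threshold } |
    Remove-Item -Force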

Conclusion
Setting up an infrastructure for testing emails is pretty easy and does not depend heavily on the application type or development platform.
Hope this helps :)