Continuous integration of SSIS projects (Part 1). Build

January 10, 2017 | | Tags : SQL SSIS Build Continuous Integration

SQL Server Integration Services (SSIS) is a tool widely used by many companies. Unfortunately, it was built a long time ago, back when build processes and Continuous Integration (CI) were uncommon. Since then, SSIS has remained mostly unchanged; the only notable changes were the project deployment model introduced in SQL Server 2012 and compatibility with new versions of SQL Server and the Visual Studio shell.

Therefore the whole mentality of the SSIS toolset has remained developer-workstation oriented: deployment package compilation and server deployment are done from the Visual Studio shell, and MSBuild does not support SSIS projects.

A few weeks ago I decided to automate the build and deployment of SSIS projects. I came across a sample provided by Microsoft at http://sqlsrvintegrationsrv.codeplex.com/ called SSISMSBuild. Initially I was planning to use just that, but I quickly realized it was not what I wanted. First, you have to mock up an MSBuild-compatible project file. Second, you have to compile a DLL from that solution and place it into the Program Files subfolder where your SSDT installation lives, and even then all MSBuild does is pass control and parameters to a method in that DLL. On top of that, the sample project was outdated: it did not support versions other than SQL Server 2012, did not support encryption, lacked many other things, and contained several critical bugs. So I decided to create a standalone SSIS build utility that does what I need.

After a few nights and weekends of development, I wrote this utility. It can handle SQL Server 2012 and 2014 projects, including decryption and re-encryption, custom parameters, release notes parsing, and other features. I still need to add support for reusable flow parts in SQL Server 2016.

The GitHub repository is here: https://github.com/rtumaykin/ssis-build. Inside there is a working sample, including build and deploy scripts, as well as instructions. The tool is distributed as a NuGet package, and the build script simply downloads it on demand and executes it with all necessary parameters.

In my next post I will dive deeper into deployment, including automatic installation of custom component assemblies into the GAC.


Plugins Isolation Without AppDomains Overhead

July 12, 2016 | | Tags : Reflection C# Impromptu Runtime Plugins

I’ve just released Impromptu, a lightweight .NET framework for plugin-based systems. It replaces the default .NET assembly binding mechanism with a custom one, which isolates the resolution of a plugin’s referenced assemblies to the plugin’s own directory. As a result, there is no need to isolate each plugin into its own AppDomain, which is extremely expensive and inefficient. The framework also makes plugin instantiation extremely fast: upon first use of a plugin, it creates, compiles, and stores a special static Instantiator class, which makes subsequent creation of plugin instances almost as fast as if it were bound at compile time.
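To give an idea of what a compiled Instantiator buys you, here is a minimal sketch of the caching idea; it is not Impromptu’s actual code, and FastActivator is a made-up name that assumes a public parameterless constructor:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;

// Compile a constructor call once per type and cache the delegate, so that
// repeated instantiations avoid the reflection cost of Activator.CreateInstance.
public static class FastActivator
{
    private static readonly ConcurrentDictionary<Type, Func<object>> Cache =
        new ConcurrentDictionary<Type, Func<object>>();

    public static object Create(Type type)
    {
        return Cache.GetOrAdd(type, t =>
            Expression.Lambda<Func<object>>(
                Expression.Convert(Expression.New(t), typeof(object))).Compile())();
    }
}
```

The expression tree is compiled only on the first request for a type; every later call is just a cached delegate invocation.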

Here is a brief description of the problem I am trying to solve:

Each AppDomain in .NET has a base directory where all application assemblies are placed when a .NET application is built. If an assembly uses types from another assembly, it stores a reference to that assembly’s full name, and the build process copies the referenced assembly to the referencing assembly’s output folder. At execution time, referenced assemblies are looked up either in the base directory or in the GAC (Global Assembly Cache). But what happens when the referencing and referenced assemblies each independently reference other assemblies that happen to have the same name, for example one references Newtonsoft.Json.dll v7.0.1 and the other Newtonsoft.Json.dll v9.0.1? Since there can be only one Newtonsoft.Json.dll in the output folder, the first assembly wins, and only one version ends up in the application’s output folder. The losing assembly may have used methods that exist only in its own version of Newtonsoft.Json.dll, which leads to runtime errors. Within a single application these errors are usually caught during testing, and developers can find a way to resolve them one way or another.
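The general direction of the fix looks roughly like this. This is a simplified sketch, not the Impromptu API, and it sidesteps the harder question of deciding which plugin a given resolution request came from: when the default binding fails, probe the plugin’s own directory.

```csharp
using System;
using System.IO;
using System.Reflection;

// Per-plugin assembly resolution: if the runtime cannot find a referenced
// assembly in the base directory or GAC, look for it in the plugin's folder,
// so each plugin can carry its own copy of, say, Newtonsoft.Json.dll.
public static class PluginLoader
{
    public static Assembly LoadPlugin(string pluginDirectory, string assemblyFileName)
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            var fileName = new AssemblyName(args.Name).Name + ".dll";
            var candidate = Path.Combine(pluginDirectory, fileName);
            return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
        };

        return Assembly.LoadFrom(Path.Combine(pluginDirectory, assemblyFileName));
    }
}
```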

Read more

DNS Proxy for Hyper-V with NAT Virtual Switch in Windows 10 v1511.

January 24, 2016 | | Tags : Hyper-V Virtual Switch NAT Networking DNS

With the November update, Hyper-V in Windows 10 version 1511 received a new Virtual Switch type, “NAT”. This is a very important addition, since the existing Virtual Switch types did not fully address the scenario of running Virtual Machines with Internet access on a laptop.

An External switch works great when the host computer is part of a static network infrastructure and you need your Virtual Machines to transparently become part of it. But when a laptop is used as the host, it becomes a constant nightmare: every time the laptop connects to a different network, each VM on it has to negotiate network settings with that network’s DHCP server, and if there is no DHCP, their IP addresses, gateways, and DNS settings have to be set manually. Then they may need to accept the network’s terms and conditions, and so on. On top of that, you can choose only one physical adapter for the Virtual Switch, so if you use Ethernet in one place and wireless in another, you have to modify the External Virtual Switch settings every time you change the physical adapter that connects to the Internet. Not a whole lot of fun.

Read more

Simple Data Access Layer Generator.

January 17, 2016 | | Tags : SQL Data Access Layer C# User Manual

Introduction

A few years back I created a T4-based C# code generator that generated enums from database tables. It was a very useful exercise: I learned how to build Visual Studio item templates and how T4 templates are implemented, and I used the tool to simplify my own development, since I no longer needed to remember to update enums in response to data changes. Later I decided to use that experience to build another tool, which I called the Simple Data Access Layer Generator. In addition to generating enums, this tool also generates the C# code to call stored procedures and retrieve the results. After all, calling a stored procedure is always done the same way: open a connection, create a command, define parameters, assign parameter values, execute the stored procedure, and finally retrieve the return value, output parameters, and recordsets. The difficult part has always been that SQL types differ from .NET types, so after every call it was necessary to translate everything into something C# could consume.
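For context, here is the kind of boilerplate the generator is meant to write for you, done by hand for a single call. The stored procedure dbo.GetOrdersByCustomer, its parameter, and the column layout are made up for illustration:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class OrdersRepository
{
    public static void PrintOrders(string connectionString, int customerId)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetOrdersByCustomer", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;

            var returnValue = command.Parameters.Add("@ReturnValue", SqlDbType.Int);
            returnValue.Direction = ParameterDirection.ReturnValue;

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Translate SQL types (e.g. NULL via DBNull) into .NET types by hand.
                    var shippedDate = reader.IsDBNull(1)
                        ? (DateTime?)null
                        : reader.GetDateTime(1);
                    Console.WriteLine($"{reader.GetInt32(0)}: {shippedDate}");
                }
            }

            // The return value is available once the reader has been consumed and closed.
            Console.WriteLine($"Return value: {(int)returnValue.Value}");
        }
    }
}
```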

Read more

My recent presentation at LADOTNET User Group. Introduction to the SQL Server Service Broker.

January 4, 2016 | | Tags : SQL SQL Server Service Broker Talks LADOTNET

My friend and colleague Vladimir Sotirov and I had the honor of opening the 2016 LADOTNET user group year with our presentation on SQL Server Service Broker. I have published the presentation and code examples, including instructions, on GitHub.

I previously gave a similar presentation at SoCal Code Camp 2015 at UCLA; back then it was my first speaking experience. This time I did some homework, changed and improved the presentation, and felt more comfortable and confident.

Thanks to Michael Chiang and Rich Kahn from Crescent Solutions for providing the venue and snacks, and to Hattan Shobokshi and Aaron Stannard for providing a great example and inspiration.

Read more

Loading custom assemblies into SSIS Script Task without installing them into GAC - Part 2

December 9, 2015 | | Tags : SQL SSIS Script Task Assembly

In my previous post, I mentioned using custom assemblies in SSIS script components and tasks without installing them into the GAC. Still, I was not fully happy with the solution, since you still had to distribute SSIS packages separately from the custom assemblies, leaving room for mistakes (version mismatches, invalid paths, etc.).

The immediate idea was to embed the required DLLs into the Script project. But because an SSIS package stores everything in an XML document, any binary project item immediately caused XML errors. So if I can’t embed anything that is not text, what can I do? The solution is somewhat hack-ish, but it works: take an existing DLL, encode it with base64 to get an ASCII body, compile it into the project as an embedded resource, and then, right before the assembly reference needs to be resolved, read the resource as a string, decode it back to binary, save it to disk, and return the path to the newly created DLL. This way I can deploy all of my external dependencies without worrying about whether they exist on the destination server.
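A rough sketch of that trick could look like this; the resource name ST_Scripts.MyAssembly.dll.b64 and the file name are made up:

```csharp
using System;
using System.IO;
using System.Reflection;

// The DLL is stored as a base64-encoded text file compiled into the script
// project as an embedded resource. Before the dependency is needed, decode it,
// write it to disk, and hand the path to Assembly.LoadFrom (for example from
// inside an AssemblyResolve handler).
public static class EmbeddedAssemblyExtractor
{
    public static string ExtractToDisk(string resourceName, string fileName)
    {
        var assembly = Assembly.GetExecutingAssembly();
        using (var stream = assembly.GetManifestResourceStream(resourceName))
        using (var reader = new StreamReader(stream))
        {
            var bytes = Convert.FromBase64String(reader.ReadToEnd());
            var path = Path.Combine(Path.GetTempPath(), fileName);
            File.WriteAllBytes(path, bytes);
            return path;
        }
    }
}

// Illustrative usage (names are hypothetical):
// AppDomain.CurrentDomain.AssemblyResolve += (s, e) =>
//     Assembly.LoadFrom(EmbeddedAssemblyExtractor.ExtractToDisk(
//         "ST_Scripts.MyAssembly.dll.b64", "MyAssembly.dll"));
```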

Read more

Loading custom assemblies into SSIS Script Task without installing them into GAC - Part 1

October 7, 2015 | | Tags : SQL SSIS Script Task Assembly

Recently I came across an article by David Browne describing how to reference a custom assembly in an SSIS Script Task or Script Component without having to install it into the Global Assembly Cache (GAC). Since I want to use custom assemblies in some of my SSIS projects, I loved the idea of being less dependent on a particular server’s software configuration and not having to load assemblies into the GAC of every server the SSIS package might run on.

I followed the instructions in the article, set up a simple SSIS package with just a Main() method, and ran it. But my code would not run, throwing an “exception has been thrown by the target of an invocation” error from Reflection. I realized that when the Main method is loaded via Reflection, none of my custom assemblies have been loaded yet, so the required reference to a custom assembly does not exist at that point.

So when I moved all of the logic that uses the custom assemblies into a separate method, leaving Main as just an entry point, the code worked.
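The resulting shape of the script, sketched from memory rather than copied from a real package, looks roughly like this inside the script task’s ScriptMain class; the folder path and the MyCompany.Etl.Helper type are made up:

```csharp
// Requires using System, System.IO, System.Reflection. Dts and ScriptResults
// come from the SSIS script task template.
public void Main()
{
    // Register the resolver first, before any method that references the
    // custom assembly gets JIT-compiled.
    AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
    {
        var fileName = new AssemblyName(args.Name).Name + ".dll";
        return Assembly.LoadFrom(Path.Combine(@"C:\CustomAssemblies", fileName));
    };

    DoWork();

    Dts.TaskResult = (int)ScriptResults.Success;
}

private void DoWork()
{
    // Types from the custom assembly are referenced only here, so this method
    // is not JIT-compiled until the resolver is already in place.
    var helper = new MyCompany.Etl.Helper();
    helper.Run();
}
```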


Site-to-Site VPN connection between Amazon VPC and Google Cloud

March 10, 2015 | | Tags : AWS Amazon VPC VPN Linux Google Cloud

Recently I needed to set up a VPN between AWS and Google Cloud. For my own reference, and to help others, I am publishing the steps here.

  1. If you don’t have an Amazon VPC (Virtual Private Cloud) set up, follow these instructions to create one. The instructions are for a VPC with a single public subnet; you can choose a different type or add a private subnet later.
  2. Allocate a new static Elastic IP for EC2-VPC on the Amazon side. This EIP will later be assigned to the StrongSwan instance inside the Amazon VPC, but for now we just need it to set up the VPN on the Google Cloud side. Refer to this document if you need instructions on how to allocate a new EIP for your VPC.
Read more

Using Linux iptables port forwarding and Point-to-Site VPN to securely connect to SQL Azure without needing to open the SQL Azure firewall

January 31, 2015 | | Tags : iptables Azure SQL Azure VPN Linux

I recently started using SQL Azure for one of my projects. It is very inexpensive and has far more storage than I need. I use SQL Server Management Studio to connect to the server and SSL encryption to keep the communication secure. However, I don’t work from the same place all the time, so every time I need to open the SQL Azure firewall for the current IP of my Internet connection and then, for security reasons, remove that IP as soon as I am finished using that network. As you can see, too many things to remember.

I also use a Point-to-Site VPN to connect to the rest of my resources in the Azure Virtual Network that I created for more serious projects. Resources within this network can access SQL Azure instances without any issue. The idea came naturally: instead of modifying SQL Azure firewall settings, why can’t I simply use a proxy server inside the Azure Virtual Network? All requests to this proxy on port 1433 would be forwarded to the real SQL Azure server and the responses returned back to me.
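The post itself does the forwarding with iptables rules on a Linux VM; just to illustrate the relay idea in code, here is a minimal C# sketch of a TCP forwarder, where myserver.database.windows.net is a placeholder:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

// Listen on port 1433 inside the virtual network and relay each connection,
// byte for byte, to the real SQL Azure server.
public static class TcpForwarder
{
    public static async Task RunAsync()
    {
        var listener = new TcpListener(IPAddress.Any, 1433);
        listener.Start();
        while (true)
        {
            var client = await listener.AcceptTcpClientAsync();
            _ = RelayAsync(client, "myserver.database.windows.net", 1433); // fire and forget
        }
    }

    private static async Task RelayAsync(TcpClient client, string host, int port)
    {
        using (client)
        using (var server = new TcpClient())
        {
            await server.ConnectAsync(host, port);
            var upstream = client.GetStream().CopyToAsync(server.GetStream());
            var downstream = server.GetStream().CopyToAsync(client.GetStream());
            await Task.WhenAny(upstream, downstream); // stop when either side closes
        }
    }
}
```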

Read more

Cloning an Amazon Elastic Beanstalk environment

October 1, 2014 | | Tags : AWS Elastic Beanstalk Powershell

To deploy a new version of software using Elastic Beanstalk, Amazon recommends deploying it first to a new environment of the application and then swapping the environments’ CNAME records, as described here.

To simplify creating the new environment, I wrote a simple PowerShell script that saves the current configuration of the production environment, creates a new environment based on that configuration, and removes the saved configuration after the new environment has been initiated.

Read more

Roman Tumaykin

My name is Roman Tumaykin.

I'm a senior database engineer at Rouse Asset Services. Technology is my passion. With a primary focus on database technologies (SQL and NoSQL), and through my career at startups and small companies, I have learned to wear many hats, from system administration to application and web development. Recently I started speaking at local user groups and at SoCal Code Camp.