Automation that paid for itself in a day!

The other day I was told that a client needed me to do something that made me cringe.

They wanted me to perform a task that takes hours to do. My coworker recently had to do this task, and it took him over 3 hours to complete.

Three hours is a really long time to spend on something that is mostly a copy-and-paste task in a system. I knew my team and I were going to be asked to do this again and again, and that we had better things to do, so I decided to do something about it.

The task involves shipping software, and I needed to link up and copy some settings and data for 2 new locations that a client wanted to ship from. This involves about 150 tables in the database. However, not all of those tables needed to have their data copied. An example of a table that didn't need to be copied over is one that stores a package that went out. A package can only go out from one location, so there is no point in copying over that data.

Here is what I did

The first thing I needed to do was find all the tables that needed to be copied over. I found all the tables that referenced a location. Once I had those, I grabbed all the columns for each table. I used the following query to get all the tables and columns:

SELECT TABLE_NAME, COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME IN (
    SELECT TABLE_NAME
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE COLUMN_NAME = 'location_id'
);

Now that I had all the tables and all the columns for each table, I wrote a script that generated all the insert statements that I would need and put them in a file for me to look at. All the insert statements had the following format:

INSERT INTO table (column1, column2, ...)
SELECT column1, column2, ...
FROM table
WHERE location_id = 1234;
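The post doesn't show the generator script itself, so here is a minimal sketch in Python of how statements in that format could be produced. The table names, column names, and the `generate_inserts` helper are all hypothetical; a real script would read the (table, column) pairs from the INFORMATION_SCHEMA query rather than hard-coding them.

```python
from collections import defaultdict

# (table, column) pairs as returned by the INFORMATION_SCHEMA query
# (sample data -- a real script would fetch these from the database)
rows = [
    ("orders", "location_id"),
    ("orders", "order_number"),
    ("location_settings", "location_id"),
    ("location_settings", "setting_name"),
    ("location_settings", "setting_value"),
]

# Group the columns by table.
columns_by_table = defaultdict(list)
for table, column in rows:
    columns_by_table[table].append(column)

def generate_inserts(columns_by_table, source_location_id):
    """Emit one INSERT ... SELECT per table, filtered to the source location."""
    statements = []
    for table, columns in sorted(columns_by_table.items()):
        col_list = ", ".join(columns)
        statements.append(
            f"INSERT INTO {table} ({col_list}) "
            f"SELECT {col_list} FROM {table} "
            f"WHERE location_id = {source_location_id};"
        )
    return statements

for stmt in generate_inserts(columns_by_table, 1234):
    print(stmt)
```

Dumping the generated statements to a file, as described above, then makes it easy to review and delete the tables that shouldn't be copied.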

Now that I had all the insert statements in a file, I started looking at all the tables and trimming out the ones that didn't need to have their data copied. This process took the majority of my time. In the end, I removed about 60 tables that were not needed.

Some cleanup

This is where things started to get fun. I wrapped the insert statements inside a transaction and ran them. I found a few more tables that didn't need to be copied over, because their reference to a location was only there to specify the client's default location.

At this point, I added some variables to make sure the new location_id was automatically generated, and handled a few other things that were specific to certain tables. I got it to the point where you only had to populate a single variable when you ran the script.
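The post doesn't say which database this was, so the exact variable syntax would differ (T-SQL, MySQL, etc.). As an illustration only, here is a sketch using Python's sqlite3 of the final script's shape: one source location_id to populate, a freshly generated new location_id, and every insert inside a single transaction. All table and column names are made up.

```python
import sqlite3

SOURCE_LOCATION_ID = 1234  # the only value you populate before running

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE locations (location_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE location_settings "
    "(location_id INTEGER, setting_name TEXT, setting_value TEXT)"
)

# Seed the source location with some settings to copy.
conn.execute("INSERT INTO locations VALUES (?, ?)", (SOURCE_LOCATION_ID, "Warehouse A"))
conn.execute("INSERT INTO location_settings VALUES (?, ?, ?)",
             (SOURCE_LOCATION_ID, "cutoff_time", "17:00"))
conn.execute("INSERT INTO location_settings VALUES (?, ?, ?)",
             (SOURCE_LOCATION_ID, "carrier", "UPS"))

with conn:  # everything below commits or rolls back as one transaction
    # Create the new location row; its autogenerated key is the new id.
    cur = conn.execute("INSERT INTO locations (name) VALUES (?)", ("Warehouse B",))
    new_location_id = cur.lastrowid

    # Copy each table's rows, swapping in the new location_id.
    conn.execute(
        "INSERT INTO location_settings (location_id, setting_name, setting_value) "
        "SELECT ?, setting_name, setting_value FROM location_settings "
        "WHERE location_id = ?",
        (new_location_id, SOURCE_LOCATION_ID),
    )

copied = conn.execute(
    "SELECT COUNT(*) FROM location_settings WHERE location_id = ?",
    (new_location_id,),
).fetchone()[0]
print(copied)  # both settings rows now exist for the new location
```

The transaction matters here: if any one of the ~90 remaining inserts fails, nothing is left half-copied, and you can fix the script and rerun it from scratch.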

How that automation paid for itself that day

The results were much better than I thought they would be. It took me a little over 2 hours to get the script set up. It takes seconds to copy over all the data. As I mentioned before, it took my coworker over 3 hours to copy all the information manually. Since I had to copy over 2 locations, doing it manually would have taken me over 6 hours. So the very day I wrote the script, it paid for itself: I netted 4 hours saved on just that single piece of automation. Every future request is going to save our team over 3 hours. Not a bad day!

Happy coding!