Andy Herd’s CV

Current Health

Software/Data Engineer, Current Health                                                                     Sep 2020 – Present
Key technologies: Python, Redshift, SQL, Postgres, Git, Linux/Bash, Java, Terraform

FanDuel

Software/Data Engineer (Python), FanDuel                                                                   Jun 2018 – Sep 2020
Key technologies: Python, EMR, S3/Spectrum data lake, Redshift, SQL, Postgres, MySQL, Git, Linux/Bash, Terraform

As part of the data engineering team, I was responsible for ingesting data from a variety of sources (Postgres and AWS Aurora/MySQL databases, S3 and external APIs) and for building a data lake spanning S3/Spectrum and Amazon Redshift, including scripting and applying database migrations with Alembic. I am used to working with data in a variety of formats, including Avro and Parquet. Many of my tasks involved quickly learning a piece of technology that was unfamiliar to me and then writing and testing software to put that knowledge into action: for example, I wrote some of the infrastructure for a new data lake platform using Terraform, and Apache Spark jobs to pick up and transform data from our external vendors.
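
For illustration, a minimal sketch of the kind of Spark job this involved; the bucket names, columns and schema below are invented for the example rather than taken from FanDuel's pipelines:

    # Read a vendor drop from S3 in Avro, apply a light transform, and write
    # partitioned Parquet back to the data lake.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("vendor-ingest").getOrCreate()

    # Assumes the spark-avro package is available on the cluster.
    raw = spark.read.format("avro").load("s3://example-vendor-drop/settlements/")

    cleaned = (
        raw.withColumn("settled_at", F.to_timestamp("settled_at"))
           .withColumn("ingest_date", F.current_date())
           .dropDuplicates(["settlement_id"])
    )

    (cleaned.write.mode("append")
            .partitionBy("ingest_date")
            .parquet("s3://example-data-lake/settlements/"))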

I was often involved in troubleshooting Apache Spark, Hadoop and Hive on an Elastic MapReduce (EMR) cluster. Additionally, I wrote and delivered a Python training course to an audience of around 50 people, spread across several locations and with a diverse range of experience levels.

JP Morgan Chase & Co

Senior Associate Applications Developer and Scrum Master, JP Morgan Chase & Co
Oct 2016 – Jun 2018
Key technologies: Python (including some Django), Neo4j, Git, Linux/Bash

Wood Mackenzie

Senior Data Engineer, Wood Mackenzie Data Strategy & Mgmt / Data Science
Jan 2015 – Oct 2016
Key technologies: Python, Git, SQL Server, Neo4j, Cypher, MS Excel, MS Access

My role combined business analysis and data process engineering to prepare the business for predictive analytics projects. My focus was on developing Python scripts to integrate with and enhance existing applications. I developed an oil price model that used Python for data processing and linear programming, with a front-end in MS Excel. This was the centrepiece of a commercially successful consulting project for a valued client.
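
To give a flavour of the linear-programming step, here is a minimal sketch using SciPy's linprog; the blending framing and all coefficients are illustrative placeholders, not figures from the Wood Mackenzie model:

    from scipy.optimize import linprog

    # Minimise the purchase cost of two hypothetical crude streams while
    # meeting a demand floor, subject to per-stream capacity limits.
    cost = [70.0, 65.0]            # $/bbl for streams A and B (hypothetical)
    A_ub = [[-1.0, -1.0]]          # -(qA + qB) <= -demand, i.e. qA + qB >= demand
    b_ub = [-100.0]                # demand of 100 kbbl/d
    bounds = [(0, 60), (0, 80)]    # capacity limit per stream

    result = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print(result.x, result.fun)    # optimal volumes and total cost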

Throughout my time at Wood Mackenzie I was often responsible for taking the lead on the data aspects of complex projects. My most satisfying moments were when I ‘reknitted’ processes, transforming a time-consuming manual process into something slick and intuitive. As one of the leaders of a new “data process excellence” programme, I was instrumental in setting the vision for what Wood Mackenzie’s data processes should look like and in guiding the team on how to reach our ambitious goals.

I launched data governance at Wood Mackenzie by carrying out a survey of data provenance, compiling the results into a Neo4j graph database and making them available for my colleagues to access. I quickly spotted the significant opportunity Neo4j presented: it made metadata easy to find and saved countless days on impact analysis by building a picture of how data flowed through the company.
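
The sketch below shows the idea in miniature, using the Neo4j Python driver; the labels, relationship type, names and credentials are invented for illustration:

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    with driver.session() as session:
        # Record that a report is derived from a source dataset.
        session.run(
            "MERGE (s:Dataset {name: $src}) "
            "MERGE (r:Report {name: $rpt}) "
            "MERGE (r)-[:DERIVED_FROM]->(s)",
            src="PowerPlantDB", rpt="EnergyMarketsOutlook",
        )
        # Impact analysis: everything downstream of a given dataset.
        downstream = session.run(
            "MATCH (d)-[:DERIVED_FROM*1..]->(:Dataset {name: $src}) "
            "RETURN d.name AS name",
            src="PowerPlantDB",
        )
        print([record["name"] for record in downstream])

    driver.close()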

Senior Data Analyst, Wood Mackenzie Energy Markets Team
Apr 2010 – Jan 2015
Key technologies: SQL Server, SQL Server Integration Services (SSIS), MS Excel, MS Access, Tibco Spotfire, VBA

On my own initiative I developed and launched a SQL Server database of power plant data. My work made it possible for the team to transform a spreadsheet-driven approach into a global dataset, which was subsequently launched to clients. I used SQL Server Integration Services (SSIS) to create an efficient transfer of data from the MS Excel front-end, achieving quick, consistent and reliable performance for all colleagues. This had never before been managed in the business, and I achieved it despite never having used SQL Server or SSIS previously.
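
The original load was an SSIS package; purely as an illustration of the shape of the transfer, a rough Python recast might look like this (file, table and connection details are placeholders):

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine(
        "mssql+pyodbc://user:pass@server/PowerPlants"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Read the MS Excel front-end (requires openpyxl) and stamp the load time.
    plants = pd.read_excel("power_plant_entry_form.xlsx", sheet_name="Plants")
    plants["loaded_at"] = pd.Timestamp.now(tz="UTC")

    # Append into a staging table; merging into the main table follows in SQL.
    plants.to_sql("stg_power_plants", engine, if_exists="append", index=False)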

I assumed ownership of the complex topic of units and conversion factors, which are vital to the energy industry. This involved researching and calculating every energy conversion from scratch. I released my findings in a publication sent to every client, a privilege granted to only a select few analysts. Building on this, I developed an Excel & VBA tool which was rolled out to all employees, enabling Wood Mackenzie to improve consistency in this vital business process.
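
A minimal Python sketch of the kind of conversion logic such a tool encapsulates; the factors below are standard published values, but the function and table are illustrative only, not the Excel & VBA tool itself:

    # Energy conversion via a common base unit (gigajoules).
    TO_GJ = {
        "GJ": 1.0,
        "MWh": 3.6,          # 1 MWh = 3.6 GJ by definition
        "MMBtu": 1.055056,   # 1 MMBtu ≈ 1.055056 GJ
        "toe": 41.868,       # 1 tonne of oil equivalent = 41.868 GJ (IEA convention)
    }

    def convert_energy(value, unit_from, unit_to):
        """Convert between energy units via gigajoules."""
        return value * TO_GJ[unit_from] / TO_GJ[unit_to]

    print(convert_energy(100, "MWh", "MMBtu"))  # ≈ 341.2 MMBtu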

Other noteworthy achievements included:

  • Rebuilt the web-based Wood Mackenzie Energy Markets data tool using Tibco Spotfire, directly contributing to a 240% increase in client usage in 2013.
  • Promoted to line manager in 2014.
  • Created a database to retrieve data from multiple sources, used daily by a team of analysts to save time and keep forecasts aligned and integrated with other parts of the business.
  • Made numerous improvements to MS Excel & MS Access processes on my own initiative, increasing robustness, usability and reliability.

Scottish & Newcastle Pub Company (SNPC)

Finance Analyst
Oct 2007 – Apr 2010
Key technologies: MS Excel, MS Access, VBA

The focus of my role was redeveloping processes to increase team efficiency, primarily using MS Excel and VBA. Despite having no prior experience with MS Access, I used it to rebuild the Leadership Team’s reporting system for monitoring capital & acquisition expenditure returns. Thanks to my extensive knowledge of Microsoft applications, I also delivered an MS Excel training course to internal staff.