BI Developer/Data Engineer
Who we are:
Ticker is an insurtech taking connected motor insurance to a much wider market, using the latest telematics technology and pricing methods. The executive team are insurtech experts, with a first-class track record in the industry.
What the role is:
We’re looking for an experienced Data Engineer to help build out our data analytics platform. You’ll be working in a modern cloud-native environment, delivering data analytics from data quality, modelling and ETL through to visualisation and performance tuning.
We use the best cloud-native, serverless technologies to help us focus on delivering features and value, rather than infrastructure. Our tech both delights our customers and goes unnoticed if we’re doing our job right.
You’ll be an essential part of putting data at people’s fingertips fast. It’s a fast-paced and ever-evolving environment, but you’ll be up for the challenge and happy taking the lead.
What you’ll be doing with your days:
- Developing reports and self-service analytics on top of our numerous data sources
- Working closely alongside our Development team and Business Analysts to ensure that accurate and timely use of data continues to be at the heart of our company
- Delivering insight from data to empower decision making
- Assisting in the use and monitoring of our operational databases – you’ll help ensure our people always have answers at their fingertips
- Development across our data stack to ensure the data quality of our ETL processes, automated reporting and self-service analytics
- Developing an in-depth understanding of the data within the organisation and ensuring accurate use and interpretation across the business
- Performance tuning and optimisation – we love getting the best bang for our buck
- Embracing the right tools for the job
- Enthusiastically learning new things to help us go faster and stay lean – we’ll support your ongoing learning with access to online training, books and tools
What you'll bring:
- Motivated and self-directed – you’ll have a drive to move our data platform forward strategically, while still meeting daily requirements and tactical needs.
- Passion for automation – you’ll make robots for the mundane tasks so you can focus on using your time for delivering even more value.
- An inquiring mind that needs to prove the accuracy of the numbers you’re presenting. You’ll be building confidence that the analysis of our data can be trusted, by both our stakeholders and customers.
Skills you’ll need:
- Data visualisation – you’re interested in how people can understand and interpret dashboards, tables, graphs and other visualisations at a glance. We currently use Power BI, but exposure to Tableau or the Databricks platform would be advantageous.
- SQL – whether that’s against Microsoft SQL Server or Postgres, you can handcraft SQL queries.
- Data-modelling – for self-service analytics in Power BI or Tableau, and creating historical data models for data marts/warehousing. You’ll be responsible for ensuring data is modelled correctly for our end users and system integrations.
- Data pipelines and cloud-based ETL solutions. We have a mix of data sources in our domain, and you’ll be comfortable picking the right tool for the job to get our data where it needs to be. Ideally you’ll have exposure to running data workloads on AWS or Azure, but solid ETL patterns and practices will transfer to keeping our data marts up to date.
Skills that would give you an edge:
- You’ll rapidly be moving beyond relational databases alone in this role – existing exposure to tools and platforms leveraging data lakes and non-relational data sources will set you apart. We’ve got many terabytes of data in S3 as well as AWS DynamoDB, so experience with Presto/Athena, Parquet, Redshift, Databricks Delta Lake or any HDFS/Hadoop environment would be a plus.
- Streaming, event-driven analytics and low-latency reporting – we’ve got terabytes of telematics and operational quote data to be analysed with low latency.
- Non-relational and big data processing – any data science toolchain knowledge is beneficial (including Spark, Databricks, R and so on), as we hold significant volumes of data in S3 for processing.
- Experience with the AWS and Azure data platforms is also beneficial.
- DBA experience – index tuning and optimisation – our databases are managed in the cloud but using them efficiently and keeping an eye on utilisation is key.
What we offer:
- Generous salary
- Bonus scheme (based on company and personal performance)
- Private medical insurance
- 25 days’ holiday plus your birthday off
- Enthusiastic investment in your development and ideas
- Free parking when in the office
- Flexible home working
Where we are:
Our office is in a rural location in Godalming (with direct links to London from nearby Guildford) but the business is remote-first. Location is not an issue as long as you’d be happy to come into the office to see the team semi-regularly.
If you’d like to talk about this role, please contact us at [email protected].