Data Warehouse Developer (m/f)
We are the leading mobile point-of-sale (mPOS) company in Europe. We started out 5 years ago and built our payment service from scratch to shake up the industry and wake up the entrepreneur within anyone. We created a unique device that, together with the smartphone in your pocket, allows small merchants to accept card payments anywhere. Whether our merchants are brewing coffee or fixing cars, we want to make technology that everyone knows how to use, so our merchants can get on with what they do best. From paperless onboarding to taking the first payment, we make it easy. Traditional offerings leave out small businesses; we don't. We are open and honest about our pricing and have no hidden fees.
Today, hundreds of thousands of small businesses in 16 countries around the world rely on SumUp to get paid. In addition to our original hardware and our mobile and web apps, we have gone on to develop a suite of APIs and SDKs to integrate SumUp payments into other apps and services.
Why work for us?
We do things differently. We build our own payment solution end-to-end so that we can always offer the best value and service. We know how vital payments are to small businesses, so we use our technology to solve their problems. We believe in open and transparent communication, not strict rules and hierarchies. If you're looking for the chance to innovate and disrupt the payment industry, join us. We are a team of hardworking, talented people with one goal: to build a better way to get paid. We've got some huge challenges ahead of us, and we need smart, creative people to help us tackle them. If you think you've got what it takes, join us.
SumUp is looking for a Data Warehouse Developer with strong analytical skills who can optimize and scale our data warehouse while working closely with our business development team. If you have a knack for digging through data sources to see what's really there, good communication skills, and a troubleshooter's mindset, we want you!
- design, develop, and maintain data warehouse and analytics architecture to meet business analysis and reporting needs
- develop, test, improve and maintain ETL jobs and data flow scripts to populate the data warehouse from production databases, cloud platforms and various external data sources
- design and develop data marts
- manage data quality
- build fast and scalable data pipelines between databases, data warehouse and cloud platforms
- optimize and scale data warehouse and data processing infrastructure
- contribute to the development of data analytics and self-learning applications
Your profile:
- A computer science or other quantitative degree and at least 2 years of relevant work experience
- Python is a must-have; you should also have at least basic knowledge of Java or C# and good knowledge of R
- Very good understanding of RDBMS (preferably Postgres). Knowledge of data modelling, normalization, denormalization, and the design of data warehouses and data marts. Very good knowledge of SQL and RDBMS procedural languages - PL/pgSQL is preferred, but experience with other SQL extensions like PL/SQL, T-SQL etc. is also accepted.
- Experience building ETL jobs; knowledge of ETL tools (Talend, Pentaho, SSIS, Informatica etc.)
The following skills are an advantage:
- Knowledge of Salesforce, Salesforce API (very desirable), Salesforce Apex.
- Experience building solutions on Heroku, working with Postgres on Heroku, and using Heroku Connect
- Experience managing RDBMS systems, setting up replication, and working with cloud-based RDBMS solutions
- Experience scaling Postgres for production/analytical needs
- Experience fetching data from multiple sources (Facebook, Google Analytics, Mailchimp, Twitter etc.); knowledge of APIs and web scraping
- Experience building data flows, real-time data processing and scalable systems; Kafka, RabbitMQ or other messaging systems
- Big data and/or NoSQL: Redshift, BigQuery, Hadoop, MongoDB
- Experience in software development (preferably Python)
- Knowledge in the field of BI, data science, analytics or statistics. Experience applying statistical methods to analyse merchant behaviour, find trends and patterns, and make predictions.
- Knowledge of Spark (Scala or PySpark) and/or scikit-learn, pandas, or other data science/machine learning libraries in Python or R. Experience in data analysis and data visualisation (matplotlib, ggplot etc.), and in implementing machine learning applications for fraud detection, customer analytics or anomaly detection.
- Very good knowledge of MS Excel, VBA and Tableau; experience building VBA applications for optimization, planning, and business or financial modelling. Experience building Tableau dashboards.
What we offer:
- Flat hierarchies and the opportunity to have an impact, irrespective of your job description.
- A convenient location in the heart of Mitte at U-Bahn Oranienburger Tor.
- All the startup swag you expect: kicker table, table tennis, beer in the fridge, and more!
- Our office has a strong sense of community; we get together regularly for brunches, cocktail nights, soccer, and yoga.
- Our team comes from 24 different countries, creating a fun, international environment.
Feel free to contact us for more info on our careers!