This position reports to the BI Manager.
Analytics engineers care about problems like:
- Is it possible to build a single table that allows us to answer this entire set of business questions?
- What is the clearest possible naming convention for tables in our warehouse?
- What if I could be notified of a problem in the data before a business user finds a broken chart in Looker?
- What do analysts or other business users need to understand about this table to be able to quickly use it?
- How can I improve the quality of my data as it’s produced, rather than cleaning it downstream?
- We are currently working towards building a completely new real-time, event-driven architecture for data processing and data warehousing, using open-source and serverless technologies such as Debezium, BigQuery, Dataform, and Kafka, among others (a minimal sketch of one such pipeline follows this list).
- This new lakehouse will serve as the central source of truth that multiple internal users will access to drive their daily, monthly, and quarterly decisions.
- Takealot is growing quickly, which brings a number of unique and interesting challenges. As such, data within the organization is also growing quickly. This brings a lot of opportunities for you to shape the tools, technologies, and culture around data in the company.
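To make the pipeline above concrete, here is a minimal, hypothetical sketch (not our actual implementation) of a Python consumer that reads Debezium change events from a Kafka topic and streams them into a BigQuery table. The broker address, topic name, table ID, and field layout are placeholders chosen for illustration.

```python
# Sketch only: consume Debezium CDC events from Kafka and stream them into BigQuery.
# Broker, topic, table, and field names are hypothetical placeholders.
import json

from confluent_kafka import Consumer
from google.cloud import bigquery

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",    # assumption: a reachable Kafka broker
    "group.id": "orders-cdc-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["dbserver1.shop.orders"])  # hypothetical Debezium topic name

bq = bigquery.Client()
TABLE_ID = "my-project.lakehouse.orders_raw"   # hypothetical BigQuery table

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error() or msg.value() is None:
        continue
    event = json.loads(msg.value())
    # Debezium wraps each change in an envelope; the new row state sits under "after"
    # (nested inside "payload" when schemas are enabled on the JSON converter).
    envelope = event.get("payload", event)
    row = envelope.get("after")
    if row:
        errors = bq.insert_rows_json(TABLE_ID, [row])  # streaming insert
        if errors:
            print("BigQuery insert errors:", errors)
```

In practice, the Debezium connector configuration, topic naming, and BigQuery loading strategy (streaming inserts versus batch loads) would be decided as part of the architecture work described in the mission below.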
Your mission, should you choose to accept it:
- Lead, execute and maintain the Business Intelligence/Analytics Engineering strategy that will fit into our Group Data Strategy
- Architect and implement technical solutions to support scale and security initiatives.
- Mentor and coach teams on data best practices within the organization, and drive the adoption of these practices to ensure data hygiene.
- Define standards and frameworks for best practices across the Analytics Engineering, Business Intelligence, and Software Engineering realms.
- Define and implement standard development methodologies to ensure that the team is using best practices for coding, efficiency, version control, QA, and release management.
- Oversee operational support
- Participate in the project management estimation process.
- Manage backlogs and expectations with stakeholders.
- Identify and provide input on new technology opportunities that will have an impact on the enterprise-wide Business Intelligence systems.
- Ensure team projects adhere to deadlines and budgets.
- Implement features, technology, and processes that move us towards industry best practices, improving scalability, efficiency, reliability, and security.
- Design, develop, test, and maintain BI and data architectures.
- Prepare data for descriptive, predictive, and prescriptive modeling.
- Automate repetitive tasks and manual processes related to data usage.
- Optimize data delivery.
- Participate in the recruitment process, and in the eventual mentorship and career development of employees.
- Work closely with stakeholders and BI Analysts to turn data into critical information and knowledge that can be used to make sound business decisions.
- Take responsibility for the full life-cycle development, implementation, production support, and performance tuning of the Enterprise Data Warehouse, Data Marts, and Business Intelligence reporting environments.
- Design and implement reporting and analytical solutions.
- Analyse business and functional requirements and translate them into robust, scalable, operable solutions.
- Help migrate the old QlikView and QlikSense models to our new event-driven Kappa architecture built on BigQuery, Dataform, and Looker.
- Ensure that the data pipelines and general support infrastructure continue to run and operate as efficiently as possible.
- Provide operational support, bug fixes, and performance enhancements.
- Automate data extraction and report update processes.
- Perform data validation and integrity testing.
- Perform data cleansing and multidimensional data modeling.
- Optimize data models.
- Automate data model generation and quality checks (a minimal sketch of such a check follows this list).
- Design and implement the data dictionary.
- Define and implement data provenance and lineage strategies.
- Leverage best practices in continuous integration and delivery.
- Collaborate with other engineers, ML experts, BI analysts, Data Engineers, and stakeholders to produce the most efficient and valuable solutions.
- Contribute to our data democratization and literacy vision by making accessible and easy-to-use data products and tools.
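As an illustration of the automated quality checks mentioned above, the kind that would flag a data problem before a business user finds a broken chart in Looker, here is a minimal, hypothetical Python sketch that runs a couple of SQL assertions against BigQuery. The project, dataset, table, and column names are invented for this example, and in practice such checks might live in Dataform assertions or a scheduled job instead.

```python
# Sketch only: simple automated data-quality checks against BigQuery tables.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

# Each query returns a single boolean row: True means the check FAILED.
CHECKS = {
    "orders has rows for yesterday": """
        SELECT COUNT(*) = 0
        FROM `my-project.lakehouse.orders`
        WHERE order_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """,
    "order_id is never null": """
        SELECT COUNT(*) > 0
        FROM `my-project.lakehouse.orders`
        WHERE order_id IS NULL
    """,
}

def run_checks() -> bool:
    client = bigquery.Client()
    all_passed = True
    for name, sql in CHECKS.items():
        failed = next(iter(client.query(sql).result()))[0]
        if failed:
            print(f"FAILED: {name}")
            all_passed = False
        else:
            print(f"passed: {name}")
    return all_passed

if __name__ == "__main__":
    raise SystemExit(0 if run_checks() else 1)
```

A script like this can be wired into CI or a scheduler so that a failing check blocks a release or alerts the team, rather than silently breaking downstream dashboards.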
The skills we need:
- Works well with people, and is passionate about helping people be their best
- Is a team player, an active listener, mentor, and able to communicate well
- Shows solid reasoning and decision making, with the ability to work under pressure
- Is passionate about technology, systems and data
- Is curious, always learning, and keeping up to date with the industry
- Is solutions-oriented, with a can-do attitude and high energy
- Has excellent problem-solving skills
- Has a proven track record of delivering high-quality work in a fast-paced environment
- Focuses on getting the job done, but has fun doing so
- Takes ownership and displays accountability for the work required
- Has an interest in e-commerce, and in building a brand and a business
- Is customer-centric
Qualifications & Experience:
- A Bachelor’s Degree or Advanced Diploma in Information Systems, Computer Science, Mathematics, or Engineering, and a minimum of 5 years of relevant BI/Analytics Engineering experience in a software/technology environment, are required.
- In the event that a candidate does not have a Bachelor’s Degree or an Advanced Diploma (in Information Systems, Computer Science, Mathematics, or Engineering), an equivalent experience requirement must be met, which equates to a minimum of 8 years of relevant Data Engineering experience in a software/technology environment.
- Experience leading a team technically, with a focus on very high-quality output, is required
- Experience mentoring, coaching, and developing others technically
- Experience working within an Agile team, and an advocate of that culture
- Significant technical experience and a proven track record of data modeling and schema design
- A thorough understanding of database and data warehousing principles (e.g. OLAP, Data Marts, star and snowflake schemas, etc.)
- The ability to write code (we use Java and Python)
- A thorough understanding of computer science fundamentals, including object-oriented design, data structures and algorithms, Linux and operating systems, and networking
- Minimum 5 years’ experience using a BI reporting tool (e.g. QlikView, Tableau, Microsoft Power BI, Looker, etc.)
- Experience using Looker, BigQuery and Dataform is advantageous
- Minimum 5 years SQL experience
- Experience with build systems (Jenkins, GitLab, Spinnaker)
- Experience with Google Cloud or another cloud provider (architecture, operations)
- High proficiency in working with large data sets and business models
- Experience with Domain Driven Design (DDD) is advantageous
- Retail or e-commerce industry experience beneficial