Data Scientist

Overview

Established in 2010 to meet the growing technology needs of agencies in the UK and the Middle East, Luminescent Digital provides digital project support and software development teams to a broad range of companies based across the EMEA and APAC regions. Luminescent Digital has now become fully integrated into the Service plan Agency Group and is represented in 34 countries worldwide.

Job Description

We are seeking a highly skilled Data Analyst/Developer with extensive experience in Python, SQL, JSON, and JavaScript, as well as familiarity with AWS technologies. This role requires a candidate who can demonstrate high-level expertise in data analysis, ETL processes, and development workflows.

Experience with Palantir is considered an advantage.

Essential Skills:

1. High-Level Python Expertise:

Advanced data manipulation skills using libraries like Pandas and NumPy.

Proficiency in data analysis and visualization with tools such as Matplotlib, Seaborn, or Plotly.

Strong ability to automate ETL processes and write efficient Python scripts for data extraction, transformation, and loading.

Experience with API integration, including authentication, pagination, and error handling in Python.

Competence in writing unit tests, debugging, and optimizing Python code for performance, especially with large datasets.

Proficiency in working with SQL, JSON, and JavaScript, which are essential for handling data in various formats and for scripting within certain platforms.

2. AWS Technologies:

Familiarity with AWS S3 for data storage and retrieval.

Experience with AWS Glue for managed ETL services.

Proficiency in querying data with AWS Athena using SQL.

Knowledge of AWS Lambda for serverless compute in automating ETL workflows.

Understanding of Terraform for building, changing, and versioning infrastructure, particularly for setting up data pipelines on AWS.

3. Data Modeling:

Experience with data modeling in Spark, as well as familiarity with Postgres and ElasticSearch.

Proficiency with the Parquet file format, optimized for big data processing frameworks like Spark.

Knowledge of NoSQL databases, particularly MongoDB, for handling unstructured or semi-structured data.

4. Development Practices:

A developer mindset with a strong grasp of version control concepts, including repositories, branches, and versioning using Git.

Understanding of modularity, with the ability to split work into reusable components, particularly within analyses and in tools such as Slate.

Experience with Agile methodologies, particularly Scrum, and the ability to work in a fast-paced, iterative development environment.

Familiarity with DevOps practices, including continuous integration and continuous deployment (CI/CD), and the use of tools to automate development processes.

Desirable Skills:

UI Design: While not essential, experience in UI design is considered a plus.

Palantir Experience: Experience with Palantir technologies is of great advantage, as it indicates familiarity with their data integration, analysis, and operational platforms.

We are looking for candidates who can bring a developer's approach to organizing work, understand the importance of modularity, and have a track record of using high-level Python skills in conjunction with AWS technologies to manage and analyze data effectively. The ideal candidate will also be comfortable with Git, Scrum, and DevOps practices, contributing to a collaborative and efficient development process.


Skills & Requirements

AWS (Mandatory), Python, SQL, JSON, ETL, and JavaScript
