The company is a global leader in partnering with businesses to transform and manage their operations by leveraging the power of technology. The group is driven daily by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization with 350,000 members in over 50 countries.
As Sr. Data Engineer, you will work on one of the world's largest social media platforms, which handles a few petabytes of incoming data daily. You will contribute as part of a self-organized R&D team working in a challenging, innovative environment for our client.
You will investigate, design, and implement solutions to many technical challenges using cutting-edge technologies, including building and enhancing a data processing platform that powers software used by hundreds of millions of users.
Main tasks and responsibilities:
● Collaborate with cross-functional stakeholders to collect data requirements.
● Develop data pipelines in Python to import data into Snowflake.
● Ensure that assigned areas are delivered within set deadlines and to the required quality objectives.
● Provide estimations, agree task durations with the manager, and contribute to the project plan for the assigned area.
● Analyze alternative solutions and make implementation decisions for the area based on your experience and technical expertise.
● Report on area readiness and quality, and raise red flags in crisis situations beyond your area of responsibility (AOR).
● Resolve crisis situations within your AOR.
● Initiate and conduct code reviews; create code standards, conventions, and guidelines.
● Suggest technical and functional improvements that add value to the product.
● Continuously improve your professional skills.
Must have:
● University degree in Computer Science or a related field.
● 7+ years of experience as a Data Engineer building data pipelines for a cloud data warehouse.
● Strong Python and SQL skills.
● Experience building data pipelines in Python.
● Experience extracting data from REST APIs and ingesting it into a cloud data warehouse.
● Experience working with S3 buckets and writing DAGs on Airflow.
● Experience programmatically working with any cloud data warehouse.
● High code quality, automated testing, and other engineering best practices.
● Effective communication, collaboration, and interpersonal skills.
● Results-oriented approach.
● Good English (oral and written) and strong communication skills in general.
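To illustrate the kind of work described above, here is a minimal sketch of a REST-API-to-warehouse ingestion step. The record schema and helper names are hypothetical; a real pipeline would fetch from the API with `requests`, load via the Snowflake Python connector (or a stage plus `COPY INTO`), and run as an Airflow DAG task.

```python
import json
from typing import Iterable

def normalize_records(raw: Iterable[dict]) -> list[tuple]:
    """Flatten hypothetical API JSON records into (id, name, score) rows
    ready for a warehouse INSERT/COPY."""
    rows = []
    for rec in raw:
        rows.append((rec["id"], rec.get("name", ""), float(rec.get("score", 0))))
    return rows

def to_stage_file(rows: list[tuple], path: str) -> int:
    """Write rows as newline-delimited JSON, a format a Snowflake stage
    can ingest; returns the number of rows written."""
    with open(path, "w") as f:
        for rid, name, score in rows:
            f.write(json.dumps({"id": rid, "name": name, "score": score}) + "\n")
    return len(rows)
```

In an Airflow setup, each of these steps would typically be its own task (extract, stage to S3, load to Snowflake) so failures can be retried independently.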
Would be a plus:
● Experience in AWS.
Benefits:
● OSDE 210 family health plan.
● Birthday day off.
● Continuous training through content platforms.
And more!
Apply for this position
If you are already talking to a recruiter from CONEXIONHR, please do NOT fill in the form.