Data Engineer

  • Location: Houston, Texas 77042 - United States
  • Sector: Business Intelligence/Analytics
  • Type: Contract
  • Job ref: 1618
  • Contact: Michael Hansen
  • Expiry date: 2024-05-22
  • Published: 2 weeks ago

We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will deliver consulting projects and work collaboratively to accomplish client goals, developing and implementing technical best practices for data integration, ETL processes, and other data engineering activities. You will also maintain a multi-terabyte enterprise data warehouse and build the infrastructure needed for optimal extraction, transformation, and loading of data from a variety of sources using SQL and modern cloud technologies.

Responsibilities:

  • Deliver consulting projects on-time, on-budget, and in a manner that accomplishes client goals.
  • Develop and implement technical best practices for data ingestion, data quality, data cleansing, and other data integration/ETL/engineering-related activities.
  • Maintain a multi-terabyte enterprise data warehouse with accompanying incremental data pipelines (a minimal sketch of such a pipeline follows this list).
  • Build infrastructure required for optimal extraction, transformation, and loading of data from diverse sources using SQL and modern cloud technologies.
  • Conduct or participate in meetings with owners of key system components to fully understand current data and systems environments.
  • Resolve source data issues and refine transformation rules.
  • Analyze source system data to assess transformation logic and data quality through data profiling.
  • Leverage data quality processes to assist with data cleansing requirements.
  • Work with technical and business representatives to determine strategies for handling data anomalies.
  • Design ETL processes and develop source-to-target data mappings, integration workflows, and load processes.
  • Develop, test, integrate, and deploy data pipelines using a variety of tools and external programming/scripting languages as necessary.
  • Provide technical documentation and other artifacts for data pipelines, ingestion, integration, or other data solutions.
  • Identify problems, develop ideas, and propose solutions in situations that call for analytical, evaluative, or constructive thinking.
  • Apply creative thinking to identify possible reporting solution alternatives.
  • Perform other duties as assigned.
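
For illustration only, here is a minimal sketch of the kind of incremental pipeline referenced in the responsibilities above, written in PySpark against the Databricks/Spark and SQL stack named in the requirements. Every name in it (the JDBC URL, dbo.orders, analytics.orders, modified_date) is a hypothetical placeholder rather than a detail of our environment.

    # Minimal watermark-based incremental load (PySpark); all object names are
    # hypothetical placeholders, not details of any specific client environment.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    target = "analytics.orders"  # hypothetical warehouse table

    # 1. Find the high-water mark already present in the warehouse table.
    last_loaded = (
        spark.table(target)
        .agg(F.max("modified_date").alias("wm"))
        .collect()[0]["wm"]
    ) or "1900-01-01"

    # 2. Pull only rows changed since that watermark from the source system.
    incremental = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://source-host;databaseName=erp")  # hypothetical
        .option("dbtable", f"(SELECT * FROM dbo.orders "
                           f"WHERE modified_date > '{last_loaded}') src")
        .option("user", "etl_user")
        .option("password", "***")
        .load()
    )

    # 3. Apply basic cleansing rules before loading.
    cleansed = (
        incremental
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
    )

    # 4. Append the new slice to the warehouse table (Delta on Databricks).
    cleansed.write.format("delta").mode("append").saveAsTable(target)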

Requirements and Qualifications:

  • Minimum of 3 years of hands-on experience with one or more of the following data integration/ETL tools: Azure Data Factory, Databricks/Spark.
  • Experience in building on-prem data warehousing solutions.
  • Experience with designing and developing ETL processes, Data Marts, and Star Schemas (a brief star-schema load sketch follows this list).
  • Experience in building data warehousing solutions in Azure and moving data from on-prem to cloud.
  • Experience in designing a data warehouse solution using Synapse or Azure SQL DB.
  • Experience building pipelines using Synapse or Azure Data Factory to ingest data from various sources.
  • Understanding of the integration runtimes available in Azure.
  • Advanced working knowledge of SQL, including query authoring, along with experience working with relational databases and familiarity with a variety of database platforms.
  • Knowledge of programming and scripting languages such as Python and Scala.
  • Microsoft Azure Cloud platform certifications (nice to have).
  • Willingness to travel to client locations based on project needs.
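
By way of illustration, the sketch below shows a simple star-schema load of the sort referenced above: a customer dimension and an order fact built from staging tables. The schema, table, and column names are hypothetical placeholders, and the SQL is issued through PySpark as it would be on Databricks or Synapse Spark, typically orchestrated by an Azure Data Factory or Synapse pipeline.

    # Minimal star-schema (data mart) load; all table and column names are
    # hypothetical. CREATE OR REPLACE TABLE assumes Delta tables on Databricks.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Dimension: one row per customer, keyed by a surrogate key.
    spark.sql("""
        CREATE OR REPLACE TABLE mart.dim_customer AS
        SELECT  row_number() OVER (ORDER BY customer_id) AS customer_key,
                customer_id,
                customer_name,
                region
        FROM    staging.customers
    """)

    # Fact: measures at order grain, joined to the dimension's surrogate key.
    spark.sql("""
        CREATE OR REPLACE TABLE mart.fact_orders AS
        SELECT  d.customer_key,
                o.order_id,
                o.order_date,
                o.order_amount
        FROM    staging.orders AS o
        JOIN    mart.dim_customer AS d
          ON    d.customer_id = o.customer_id
    """)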

If you are a self-motivated individual with a strong technical background in data engineering and a passion for delivering high-quality solutions, we encourage you to apply. Join our team and be part of exciting projects that push the boundaries of data engineering and integration.
