
Azure Data Fabric Engineer

SLD TECHNOLOGIES LLC

Full-Time
Lead
$109k – $131k/yr
Remote
Posted 2d ago

Tech Stack

Python, Java, AWS, Azure, REST, Spark, Apache

Job Description

Job Summary

We are seeking a dynamic and innovative Azure Data Fabric Engineer to lead the design, development, and implementation of scalable data solutions within the Azure ecosystem. This role offers an exciting opportunity to work at the forefront of data engineering, leveraging cutting-edge technologies to transform complex data landscapes into actionable insights. As an Azure Data Fabric Engineer, you will be instrumental in building robust data architectures that support analytics, machine learning, and business intelligence initiatives. Your expertise will drive efficiency, scalability, and security across our data platforms, empowering teams to make data-driven decisions with confidence.

Responsibilities

* Design and develop end-to-end data pipelines using Azure Data Lake, Spark, Hadoop, and other big data tools to facilitate efficient data ingestion, processing, and storage.
* Implement ETL (Extract, Transform, Load) processes using tools such as Talend, Informatica, and custom Python scripts to ensure high-quality data flow across systems.
* Build and optimize data models within data warehouses using SQL Server, Oracle, and other relational databases to support analytics and reporting needs.
* Develop RESTful APIs for seamless integration between data services and applications, ensuring secure and scalable communication channels.
* Collaborate with cross-functional teams to gather requirements and translate them into technical solutions aligned with Agile methodologies.
* Troubleshoot issues in data pipelines, improve performance, and ensure data integrity across platforms such as Apache Hive, Spark, and Azure Data Lake.
* Support model training and analysis efforts by preparing datasets and automating workflows for predictive analytics projects.

Qualifications

* Proven experience working with cloud platforms such as Microsoft Azure and AWS for large-scale data solutions.
* Strong proficiency in Java, Python, Bash (Unix shell), and other scripting languages for automation and development tasks.
* Extensive knowledge of big data technologies, including Hadoop ecosystem components (HDFS, Hive), Spark, and related tools.
* Hands-on experience with SQL Server, Oracle databases, data warehouse design principles, and database modeling techniques.
* Familiarity with ETL tools such as Talend or Informatica for efficient data processing workflows.
* Experience with Looker or similar BI tools for creating dashboards and visualizations that enable insightful analysis.
* Understanding of linked data concepts and semantic web principles to enhance interoperability across datasets.
* Knowledge of analysis techniques for model training and predictive analytics projects.
* Ability to work within Agile teams while demonstrating excellent problem-solving skills in complex technical environments.

Join us to be part of a forward-thinking team dedicated to harnessing the power of data through innovative cloud-based solutions!

Pay: $108,665.20 - $130,865.61 per year

Work Location: Remote
