
Data Engineer

Location: Remote
Salary: 200,000
Published: 17th March
Sector: Commercial Real Estate Innovation
Job Type: Permanent
Consultant: Gary Still

Remote PropTech Data Engineer role, New York, US

Job Summary

As a Data Engineer (or Data Systems Engineer), you'll be responsible for developing and maintaining the data processing software that runs our client's ML algorithms. Your duties include coordinating with company executives and other professionals to create unique data infrastructure, running tests on your designs to isolate errors, and updating systems to accommodate changes in company needs.

Responsibilities
  • Assembling large, complex sets of data that meet non-functional and functional business requirements
  • Identifying, designing and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes
  • Building required infrastructure for optimal extraction, transformation and loading of data from various data sources using AWS and SQL technologies
  • Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition
  • Working with stakeholders, including the Executive, Product, Data and Design teams, to support their data infrastructure needs and assist with data-related technical issues
Skills and Experience
  • 5-6+ years of relevant experience, with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools
  • 3+ years' experience with cloud-based data warehouses such as Redshift, Snowflake, etc.
  • 3+ years' experience in Big Data distributed ecosystems (Hadoop, Spark, Hive and Delta Lake)
  • 3+ years' experience with workflow orchestration tools such as Airflow
  • 3+ years' experience with Big Data distributed systems such as Databricks, AWS EMR, AWS Glue, etc.
  • 3+ years' experience working with multiple Big Data file formats (Parquet, Avro, Delta Lake)
  • Experience with monitoring tools such as Splunk, Grafana, CloudWatch, etc.
  • Experience with container management frameworks such as Docker, Kubernetes, ECR, etc.
  • Experience with CI/CD processes such as Jenkins, Codeway, etc., and source control tools such as GitHub
  • Strong experience in coding languages such as Python, Scala and Java

Knowledge and Skills
  • Fluent in relational database systems and writing complex SQL
  • Fluent in complex, distributed and massively parallel systems
  • Strong analytical and problem-solving skills, with the ability to represent complex algorithms in software
  • Strong understanding of database technologies and management systems
  • Strong understanding of data structures and algorithms
  • Database architecture testing methodology, including execution of test plans, debugging, and testing scripts and tools

Nice to Have
  • Experience with transformation tools such as dbt
  • Experience building real-time streaming data pipelines
  • Experience with pub/sub streaming technologies such as Kafka, Kinesis, Spark Streaming, etc.
Education

Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering or equivalent

