Data Engineer - Hybrid - SimioCloud Engineering - Manchester, NH

Description

About Us
SimioCloud, a Moore Company, helps nonprofit organizations fulfill their missions by delivering best-in-class data and software solutions.
We are developing an innovative platform that leverages vast consumer data with the latest machine learning and artificial intelligence to drive fundraising and marketing optimization for nonprofits across all channels.
Our clients range from startups to some of the largest charities in the world.
We're looking for an individual who will help lead the development efforts of the software engineering team.
They will engage with the Data Science, Data Engineering, and Strategic Analytics teams to enhance and support an in-house data science application, as well as help spearhead other development efforts.
Requirements

Job Summary:
As a Data Engineer, you will help us organize, manage, and ETL various data elements.
You will be responsible for running ad hoc and prebuilt analytic products in a cloud-based environment, as well as directly contributing to the development and enhancement of those products.
You will learn and be expected to understand the business needs of our data, how we apply it, and what we build from it.
In this role you will gain visibility into, and room to grow across, many different areas of Analytics, Data Science, and Engineering Development.
You will be challenged to learn new things, and your input on enhancements will be expected.
Duties/Responsibilities:
Design, implement, test, deploy, and maintain stable, secure, and scalable data engineering solutions and pipelines in support of data and analytics projects, including integrating new sources of data into our central data warehouse and moving data out to applications and affiliates.
Produce scalable, replicable code and engineering solutions that help automate repetitive data management tasks.
Create automated reports and BI tools that bring insight into the status of all SimioCloud data, used for (but not limited to): quality control; daily, weekly, and monthly aggregate and granular changes in the data; and the success of ongoing campaigns (a minimal sketch of such a report follows this list).
Implement data retention policies to lower costs, increase computational efficiency, and comply with internal and external requirements.
Increase automated process efficiency to both raise downstream productivity and lower costs.
Analyze databases across aspects such as data integrity, data modeling, and performance to increase productivity and decrease costs at an enterprise level.
Work in cross-functional, geographically distributed, highly skilled agile delivery teams to continuously innovate solutions.
Perform other duties as required.
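One of the duties above, automated reporting on daily and weekly changes in the data, could start as simply as the following sketch. This is a minimal illustration, not SimioCloud's actual tooling: the connection string, schema, table, and the 50% alert threshold are all hypothetical.

```python
# Hedged sketch of an automated data-quality report: daily row counts and
# day-over-day deltas for one monitored table. All names are illustrative.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@host/warehouse")  # placeholder DSN

daily = pd.read_sql(
    """
    SELECT CAST(loaded_at AS DATE) AS load_date, COUNT(*) AS row_count
    FROM raw.donations                 -- hypothetical staging table
    GROUP BY CAST(loaded_at AS DATE)
    ORDER BY load_date
    """,
    engine,
)

daily["delta"] = daily["row_count"].diff()             # day-over-day change
daily["pct_change"] = daily["row_count"].pct_change()  # relative change

# Flag days whose volume swings more than 50% for the QC report.
suspicious = daily[daily["pct_change"].abs() > 0.5]
print(suspicious.to_string(index=False))
```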
Required Skills/Abilities:
Strong command of relational databases and SQL.
Ability to Extract, Load, and Transform (ELT) data into a relational database (a minimal sketch follows this list).
Proficiency with Azure Data Factory or related cloud-based ELT tools.
Demonstrated ability to work independently and be a self-starter.
You are open, collaborative, and comfortable working in an evolving Scrum Agile atmosphere.
Ability to convey technical concepts to non-technical audiences.
Ability to work on several workstreams at once.
Strong leadership and communication skills.
Ability to visualize and communicate architecture and development concepts.
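To make the ELT requirement above concrete, here is a minimal load-then-transform sketch against a Snowflake-style warehouse, one of the databases named under Experience below. The connection string, stage, and table names are hypothetical placeholders, not SimioCloud's real objects.

```python
# Minimal ELT sketch: land raw rows first, then transform inside the warehouse
# with SQL. DSN, stage, and table names are hypothetical placeholders.
from sqlalchemy import create_engine, text

engine = create_engine("snowflake://user:pass@account/db/schema")  # placeholder DSN

LOAD = text("""
    COPY INTO raw.donations                        -- hypothetical staging table
    FROM @landing_stage/donations/                 -- hypothetical external stage
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

TRANSFORM = text("""
    INSERT INTO analytics.daily_donations          -- hypothetical reporting table
    SELECT donor_id,
           CAST(donated_at AS DATE) AS donation_date,
           SUM(amount)              AS total_amount
    FROM raw.donations
    GROUP BY donor_id, CAST(donated_at AS DATE)
""")

with engine.begin() as conn:  # one transaction: load first, then transform in place
    conn.execute(LOAD)
    conn.execute(TRANSFORM)
```

The point of the ELT ordering, as opposed to ETL, is that the transform runs where the data already lives, so the warehouse's own compute does the heavy lifting.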
Education and Experience:
3+ years implementing large-scale data projects within a cloud environment, including interactions with cloud storage systems, service apps, and automated delivery solutions.
3+ years of experience with cloud databases (Snowflake, Azure SQL DW, AWS Redshift, or similar).
3+ years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational database, NoSQL, and data lake solutions.
Writing automated tests for your code and data feels right to you.
Nice to Have:
Experience with BI tools such as Power BI, Tableau, or similar.
Experience with Python for data extraction and manipulation (a brief sketch follows).
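For the Python nice-to-have, a small extraction-and-manipulation sketch might look like this. The endpoint URL, JSON shape, and column names are hypothetical, chosen only to illustrate the pattern.

```python
# Hedged sketch of Python data extraction and manipulation: pull records from
# a hypothetical REST endpoint, tidy them with pandas, hand off as Parquet.
import pandas as pd
import requests

resp = requests.get("https://api.example.com/v1/donors", timeout=30)  # placeholder URL
resp.raise_for_status()

df = pd.json_normalize(resp.json()["results"])       # flatten nested JSON (assumed key)
df["created_at"] = pd.to_datetime(df["created_at"])  # normalize types (assumed column)
df = df.drop_duplicates(subset="donor_id")           # basic cleanup (assumed key column)

df.to_parquet("donors.parquet", index=False)         # hand off to the next pipeline stage
```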
Recommended Skills: Agile Methodology, Amazon Redshift, Analytical, Architecture, Automation, Azure Data Factory.
Estimated Salary: $20 to $28 per hour based on qualifications.
