About This Role
Hello, prospective pickle! Design Pickle is looking for a Data Engineer to join our team and help us develop new ways to inspire our customers and streamline processes for our global network of creatives. You will be tasked with building and maintaining the right data pipelines and data models to power our decision-making and enable actionable insights.
The ideal candidate will be able to create efficient, flexible, extensible, and scalable data models, ETL designs, and data integration services, and will also support and manage the growth of these data solutions. Given our aspirational vision of being the most helpful creative platform in the world, and the nature of our products, this role requires entrepreneurial drive and thinking, comfort with ambiguity, and the ability to break down and solve complex problems.
If you have ever wanted to make a significant contribution and help shape the trajectory of a startup, this role is for you!
Reports to: Director of Data Science & Analytics
On a daily basis, works closely with: Engineering, Product Management, Product Marketing, and Global Operations.
Location: Design Pickle is a fully remote company with a Company Hub in Scottsdale, Arizona.
Who We Are Looking For
First, Design Pickle is anything but typical. We’re a group of hard-working, creativity-loving individuals from around the world.
Do we love pickles, too? Most of us! But don’t stress if pickles aren’t your thing; it’s not a deal-breaker. We do look for passion and interest in something, though, because our employees’ uniqueness is what has helped make us the great company we are today.
We stand by our vision, purpose, and values, and these are mission-critical to how you show up every single day.
Specific to your role, we’re looking for individuals who have...
- At least two years of software development experience spanning the full product lifecycle: ideation, development, deployment, and iteration.
- A minimum of three years' expertise in crafting and optimizing SQL queries. Candidates should be well-versed in manipulating and extracting data to meet business needs.
- Over two years of hands-on experience in ETL (Extract, Transform, Load) processes, showcasing proficiency in designing, implementing, and maintaining robust ETL pipelines.
- At least two years of programming experience with a focus on object-oriented languages, such as Python.
- A minimum of two years in database schema design and dimensional data modeling, illustrating a deep understanding of how to structure and model data effectively for scalability and performance.
- Proven experience in the data warehousing field, indicating a solid foundation in managing large-scale data storage solutions.
- Demonstrated ability to analyze datasets to uncover discrepancies and inconsistencies, thereby ensuring data quality and reliability.
- Practical experience with Amazon Web Services (AWS), including but not limited to S3, Redshift, and Machine Learning services. Candidates should be comfortable leveraging these services to enhance data storage, processing, and analytics capabilities.
- Expertise in managing and clearly communicating plans for data sourcing and pipeline development to stakeholders within the organization, ensuring alignment and understanding across teams.
- Exceptional problem-solving abilities, with a knack for navigating through unclear requirements and delivering effective solutions.
Bonus Pickle Points:
- A Bachelor's or Master's degree in Computer Science, a related technical field, or equivalent practical experience.
- Additional experience with AWS, specifically in managing Data Lakes, is highly regarded.
- Familiarity with building and utilizing reports in business intelligence tools such as PowerBI and Tableau, enhancing decision-making and insights.
- Proficiency in Ruby on Rails, adding value through versatile web development skills.
- A proven track record of working independently within globally distributed teams, showcasing effective communication and collaboration across different time zones.
- Demonstrated capacity to leverage data in influencing pivotal business decisions, underlining the strategic use of insights in driving outcomes.
Key Objectives and Responsibilities
As a fast-growing company, our roles are always evolving. However, we want you to know exactly what you’re walking into. Here is a preview of what’s expected in the first 90 days:
- Conceptualize and own the data architecture for our suite of tools and analytics platform.
- Create and contribute to frameworks that improve the efficiency of logging data, while working with data infrastructure to troubleshoot and resolve issues.
- Collaborate with engineering, product management, product design, and product marketing to understand data needs, representing key data insights in a meaningful and actionable way.
- Define and manage SLAs for all datasets.
- Determine and implement the security model based on security and privacy requirements; confirm safeguards are followed; address data quality issues; and evolve governance processes.
- Design, build and launch sophisticated data models and visualizations that support our products and global operational processes.
- Solve data integration problems utilizing ETL patterns, frameworks, and query techniques, sourcing from both structured and unstructured data sources.
- Optimize pipelines, dashboards, frameworks, and systems to streamline development of data artifacts.
- Mentor team members for best practices in the data engineering space.
- Maintain a strong commitment to documentation.
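To give a flavor of the day-to-day work described above, here is a minimal, purely illustrative ETL sketch in Python. All names (the CSV file, the `design_requests` table, and the column names) are hypothetical examples, not part of Design Pickle's actual stack:

```python
# Minimal illustrative ETL pipeline: extract rows from a CSV export,
# normalize them, and load them into a SQLite table.
# File name, table name, and schema are hypothetical.
import csv
import sqlite3

def extract(path):
    """Yield raw rows from a CSV file as dicts."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Cast types and normalize string fields."""
    for row in rows:
        yield {
            "request_id": int(row["request_id"]),
            "status": row["status"].strip().lower(),
        }

def load(rows, conn):
    """Idempotently upsert transformed rows into SQLite."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS design_requests "
        "(request_id INTEGER PRIMARY KEY, status TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO design_requests VALUES (:request_id, :status)",
        rows,
    )
    conn.commit()
```

Because each stage is a generator feeding the next (`load(transform(extract(path)), conn)`), the pipeline streams rows without holding the full dataset in memory; production pipelines apply the same extract/transform/load separation at warehouse scale.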
$100,000 - $115,000 a year
The compensation range for this position is $100,000 to $115,000 annually. The actual salary offered to a candidate will be made with mindful consideration of many factors, including but not limited to skills, qualifications, education/knowledge, experience, and alignment with market data for a given location within the US. In addition to base salary, some positions may be eligible for additional forms of compensation, such as bonuses or commissions. This salary data is for our US-based positions only.