Responsibilities
- Design, develop, and support multiple data projects in traditional relational databases such as PostgreSQL, as well as non-traditional databases such as Hive and Vertica
- Analyze business requirements, then independently design and implement the required data models and ETL/ELT processes
- Lead decision-making and planning for data architecture and engineering
- Translate complex technical subjects into terms that can be understood by both technical and non-technical audiences
Qualifications (must have)
- 5+ years of experience with database development on Oracle, MSSQL, or PostgreSQL
- 3+ years of hands-on experience with at least one ETL tool, such as Pentaho, Talend, Informatica, or DataStage
- 3+ years of data modeling experience, both logical and physical
- Strong communication and documentation skills are essential, as you will work directly with both technical and non-technical teams
- Experience working closely with teams outside of IT (e.g., Business Intelligence, Finance, Marketing, Sales)
- Experience defining architectural requirements and setting up the supporting infrastructure
- Ability to work with minimal or no direct supervision
Desired (nice to have)
- Working knowledge of big data databases such as Vertica, Snowflake, or Redshift
- Experience with the Hadoop ecosystem, including programming or working with key data components such as Hive, Spark, and Sqoop to move and process terabytes of data
- Web analytics or Business Intelligence experience a plus
- Understanding of the ad stack and its data (ad servers, DSPs, programmatic, DMPs, etc.)
- Knowledge of scripting languages such as Perl or Python
- Hadoop administration experience a plus, but not required
- Exposure to or understanding of scheduling tools such as Airflow
- Experience in a Linux environment is preferred but not mandatory