Careers at RxSense
Our culture is more than a set of values, principles and promises. It’s the dynamic embodiment of who we are, where we’ve been and where we’re headed tomorrow.
Lead Data Engineer
- Job Title: Lead Data Engineer
- Location: Boston, MA
- Other Location: New Jersey
RxSense is a Boston-based health technology company offering powerful technology solutions built to make prescription drugs more affordable for all. We operate two business units – Enterprise Platform Solutions (EPS), which provides turnkey technology services to employers, health plans, and others who wish to administer pharmacy benefits, and Direct to Consumer (DTC), which offers cash pay prescription benefits to people under the consumer brand SingleCare.
Incorporated in 2014, RxSense is one of the fastest-growing companies in New England. We are privately held and proud to have been recognized by Fast Company as one of 2020’s Most Innovative Companies, and by Forbes as a 2020 Best Startup Employer. Our CEO and Founder, Rick Bates, was named the EY New England Entrepreneur of the Year award winner in 2018.
Responsibilities
- Provide direction to and lead our data engineering and architecture team, determining the optimal approach to business demands.
- Implement data analytics best practices in data modeling, ETL pipelines, and near-real-time data solutions.
- Coordinate with business analysts to validate requirements; conduct interviews with users and developers.
- Implement solutions to integrate external data with in-house data.
- Test and validate data flows and prepare ETL processes according to business requirements.
- Design a multi-tenant data platform.
- Lead technical architecture discussions and help drive technical implementations.
- Design and implement a data conversion strategy from legacy to new platforms.
- Perform design validation, reconciliation, and error handling in data load processes.
Qualifications
- 10+ years of experience in data analytics and business intelligence projects
- Extensive knowledge of BI concepts (ETL, dimensional modeling, data warehouse design)
- Experience creating data pipelines using Python/Spark and traditional ETL tools such as Informatica and Pentaho
- Experience designing for columnar databases such as Vertica, Redshift, or Snowflake
- Experience building monitoring and alerting mechanisms for data pipelines
- Working knowledge of developing integrations using APIs such as REST and SOAP, as well as JDBC/ODBC connections
- Strong analytical and problem-solving skills; excellent written and verbal communication skills
- AWS architecture experience preferred
- Nice to have: experience implementing data streaming processes using Kafka or Kinesis, and NoSQL databases