INTERNSHIP DETAILS

Data Scientist Intern

Company: Plymouth Rock Assurance
Location: Woodbridge Township
Work Mode: On Site
Posted: January 12, 2026
Internship Information
Core Responsibilities
The intern will support the design and optimization of data pipelines and contribute to machine learning workflows. They will also assist in data quality checks and collaborate with data scientists to automate data processes.
Internship Type
Intern
Company Size
2,185
Visa Sponsorship
No
Language
English
Working Hours
40 hours

About The Company
If you’ve found your way to our LinkedIn page, you’re likely here for one of two reasons: you’re reviewing insurance companies, or you’re browsing new career opportunities. Well, you’ve come to the right place.

Insurance Customer
In its 40 years, Plymouth Rock Assurance has grown to be one of the leading auto and home insurers in the Northeast. With unique features like Crashbusters® mobile claims vans, Door to Door Valet Claim Service®, the Get Home Safe® taxi and rideshare benefit, Road Rewards®, and our Home Insurance Quick Quote, we’re committed to delivering our customers “More Than Just Insurance.” To get a free quote, visit plymouthrock.com or talk to a Plymouth Rock agent.

Job Seeker
We take pride in both the strength and commitment of our Plymouth Rock Assurance team. We actively seek individuals who exhibit friendliness, integrity, loyalty, hard work, and the pursuit of excellence. If you’re currently looking for a new employment opportunity, we hope you’ll consider Plymouth Rock. For a full list of job openings, please visit www.plymouthrock.com/about/careers.
About the Role


As a Data Scientist Intern, you will work on cutting-edge analytical and data engineering projects that drive measurable business impact across pricing, underwriting, marketing, and claims.

This internship is ideal for a technically curious, motivated problem-solver who wants hands-on data science experience.


RESPONSIBILITIES

  • Support the design, construction, and optimization of robust data pipelines to enable machine learning and analytical modeling.
  • Contribute to the design and implementation of data and ML workflows using orchestration tools such as Dagster, Airflow, or similar frameworks.
  • Help implement data quality checks, validation routines, and monitoring for automated data workflows.
  • Assist in organizing and managing internal GitHub repositories to standardize ML project structures and best practices.
  • Collaborate with data scientists and engineers to automate the ingestion, transformation, and delivery of data for model development.
  • Contribute to initiatives migrating analytical processes into cloud-based data lake architectures and modern platforms such as AWS or Snowflake.
  • Develop reusable and well-tested code to support analytical pipelines and internal tools using Python and SQL.
  • Conduct data mining, cleansing, and preparation tasks to build high-quality analytical datasets.
  • Participate in model development, including data profiling, model training, validation, and interpretation.
  • Build and evaluate predictive models that enhance profitability through improved segmentation and estimation of insurance risk.
  • Assist in studies evaluating new business models for customer segmentation, retention, and lifetime value.
  • Collaborate with business leaders to translate insights into operational improvements and cost efficiencies.
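As a purely illustrative sketch (not part of the posting), the data quality checks and validation routines described above often amount to a small set of rules applied to each incoming dataset. The column names, rules, and sample data below are hypothetical, using the Pandas library the posting mentions:

```python
# Illustrative only: a minimal data quality check of the kind an intern
# might help implement. Column names and rules are hypothetical.
import pandas as pd

REQUIRED_COLUMNS = ["policy_id", "premium", "claim_amount"]

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality problems found in df (empty if clean)."""
    problems = []
    for col in REQUIRED_COLUMNS:
        if col not in df.columns:
            problems.append(f"missing column: {col}")
    if "policy_id" in df.columns and df["policy_id"].duplicated().any():
        problems.append("duplicate policy_id values")
    if "premium" in df.columns and (df["premium"] < 0).any():
        problems.append("negative premium values")
    return problems

# Hypothetical sample batch with two deliberate problems.
df = pd.DataFrame({
    "policy_id": [1, 2, 2],
    "premium": [120.0, -5.0, 80.0],
    "claim_amount": [0.0, 300.0, 0.0],
})
issues = validate(df)
```

In an orchestrated workflow (Airflow, Dagster, etc.), a check like this would typically run as its own task between ingestion and transformation, failing the run or raising an alert when `issues` is non-empty.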


QUALIFICATIONS

  • Currently pursuing or recently completed a Master’s in Data Science, Computer Science, Statistics, Economics, or related field.
  • Proficiency in Python (Pandas, NumPy, Scikit-learn, XGBoost, or PyTorch) and SQL.
  • Understanding of data engineering concepts, ETL/ELT workflows, and machine learning deployment.
  • Exposure to workflow orchestration tools (e.g., Airflow, Dagster, Prefect) and Git/GitHub for collaborative development.
  • Familiarity with Docker, CI/CD pipelines, and infrastructure-as-code tools such as Terraform preferred.
  • Knowledge of AWS cloud services such as S3, Lambda, EC2, or SageMaker a plus.
  • Experience with common modeling techniques (e.g., GLM, tree-based models, Bayesian statistics, NLP, deep learning) through coursework or projects.
  • Strong analytical, communication, and problem-solving skills.
  • A self-starter mindset, with attention to detail and enthusiasm for learning new technologies.
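To make the modeling expectations concrete: logistic regression is the simplest GLM used for risk classification of the sort described above. The sketch below (illustrative only, on synthetic data, not part of the posting) fits one by gradient descent with NumPy; in practice this would be done with scikit-learn, XGBoost, or statsmodels as listed in the qualifications:

```python
# Illustrative only: logistic regression (a GLM with a logit link) fit by
# gradient descent on synthetic data. Real work would use scikit-learn
# or similar; this just shows the underlying mechanics.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "risk" data: higher feature values => higher event probability.
x = rng.normal(size=200)
y = (x + rng.normal(scale=0.5, size=200) > 0).astype(float)

w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # sigmoid (inverse logit link)
    grad_w = np.mean((p - y) * x)           # gradient of mean log-loss w.r.t. w
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

# Evaluate in-sample accuracy of the fitted model.
probs = 1.0 / (1.0 + np.exp(-(w * x + b)))
accuracy = float(np.mean((probs > 0.5) == (y > 0.5)))
```

Tree-based models and deep learning replace the linear score `w * x + b` with more flexible functions, but the train/validate/interpret loop the responsibilities describe is the same.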


SALARY RANGE

The pay for this position is $35 per hour.


ABOUT THE COMPANY

The Plymouth Rock Company and its affiliated group of companies write and manage over $2 billion in personal and commercial auto and homeowner’s insurance throughout the Northeast and mid-Atlantic, where we have built an unparalleled reputation for service. We continuously invest in technology, our employees thrive in our empowering environment, and our customers are among the most loyal in the industry. The Plymouth Rock group of companies employs more than 1,900 people and is headquartered in Boston, Massachusetts. Plymouth Rock Assurance Corporation holds an A.M. Best rating of “A-/Excellent”.


Key Skills
Python, SQL, Data Engineering, ETL, Machine Learning, Workflow Orchestration, Git, Docker, AWS, Data Mining, Data Cleansing, Predictive Modeling, Analytical Datasets, Communication, Problem-Solving, Attention to Detail, Self-Starter
Categories
Data & Analytics, Technology