INTERNSHIP DETAILS

Intern - Research Engineer

Company: SummitTX
Location: New York
Work Mode: On Site
Posted: April 14, 2026
Internship Information
Core Responsibilities
The intern will design, build, and maintain systematic data pipelines and research-grade datasets to support alpha research and production signals. They will also collaborate with portfolio managers and analysts to operationalize models and translate investment hypotheses into testable research artifacts.
Internship Type
Full time
Salary Range
$40 - $50/hr
Company Size
81
Visa Sponsorship
No
Language
English
Working Hours
40 hours

About The Company
Founded in 2015, SummitTX (formerly Crestline Summit) is an open-architecture multi-manager, multi-strategy hedge fund manager. SummitTX manages over $3 billion across three investment solutions that harness the same investment process and portfolio; each solution has a unique risk/reward profile. SummitTX currently utilizes 1) Fundamental, 2) Tactical, 3) Quantitative, and 4) Capital Markets investment strategies. SummitTX is headquartered in Fort Worth, Texas, and has an office in New York. If you’re interested in learning more about our approach to investing or considering joining our team, we want to hear from you.
For Investors/Consultants: IR@summittxcapital.com
For Prospective Portfolio Managers: BD@summittxcapital.com
About the Role

SummitTX Capital is a multi-manager, multi-strategy hedge fund managing over $3 billion in AUM. Founded in 2015, the firm spun out from Crestline Investors in 2025 to become an independent SEC-registered adviser under the SummitTX Capital brand. We operate an open-architecture platform across Fundamental, Tactical, Quantitative, and Capital Markets strategies, with offices in Fort Worth and New York.

SummitTX is seeking exceptional master’s candidates for our Research Engineer Internship beginning in the summer of 2026. This intern will help build and scale our systematic data platform that powers alpha research and production signals. You will work end-to-end, from idea generation and data acquisition to model development, backtesting, deployment, and monitoring, with an initial portfolio mix of Long/Short Equity initiatives and Systematic Fundamental research. The role reports to the Head of Data and partners daily with portfolio managers, analysts, the central research team, risk, and operations.
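The end-to-end loop described above (investment hypothesis → data → backtest → readout) can be sketched in miniature. Everything below, from the toy momentum rule to the hypothetical price series, is an illustrative assumption, not the firm's actual methodology:

```python
# Minimal sketch of turning a hypothesis ("recent momentum predicts
# next-day returns") into a testable research artifact.
# The rule, lookback, and prices are all hypothetical.

def backtest_momentum(prices, lookback=3):
    """Go long when today's price is above its trailing mean, else stay flat.

    Returns the list of daily strategy returns.
    """
    returns = []
    for t in range(lookback, len(prices) - 1):
        trailing_mean = sum(prices[t - lookback:t]) / lookback
        signal = 1 if prices[t] > trailing_mean else 0  # long or flat
        daily_ret = prices[t + 1] / prices[t] - 1       # next-day return
        returns.append(signal * daily_ret)
    return returns

# Toy price series (hypothetical):
prices = [100, 101, 102, 103, 102, 104, 105]
rets = backtest_momentum(prices)  # one return per tradeable day
```

A real implementation would pull prices from the research data platform, account for costs and slippage, and log the run for reproducibility, but the hypothesis-to-artifact shape is the same.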

Key Responsibilities

  • Design, build, and maintain systematic data pipelines, including ingestion, medallion-style data modeling, feature engineering, and experiment tracking
  • Operationalize robust ELT workflows using DBT/SQL and Python on Databricks, with strong enforcement of data quality, lineage, and documentation
  • Develop research-grade datasets and features across market, alternative, and fundamental domains to support L/S Equity and systematic strategies
  • Productionize models and alpha signals with CI/CD pipelines, model registries, monitoring, and cost/performance optimization on Databricks and AWS
  • Partner with PMs and Analysts to translate investment hypotheses into testable research artifacts, delivering clear results, visualizations, and readouts to guide decision-making
  • Contribute to the evolution of the data platform roadmap, including observability, governance, access controls, cataloging, and documentation standards
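The medallion-style flow named in the bullets can be sketched as raw ("bronze") records validated into a clean "silver" table, then aggregated into a "gold" feature set. Field names and quality rules here are illustrative assumptions; in practice this would be DBT/SQL or Spark on Databricks rather than plain Python:

```python
# Sketch of a bronze -> silver -> gold flow with basic data-quality gates.
# Schemas and rules are hypothetical.

def to_silver(bronze_rows):
    """Keep only rows that pass basic data-quality checks."""
    silver = []
    for row in bronze_rows:
        if (row.get("ticker")
                and isinstance(row.get("close"), (int, float))
                and row["close"] > 0):
            silver.append({"ticker": row["ticker"], "close": float(row["close"])})
    return silver

def to_gold(silver_rows):
    """Aggregate a per-ticker feature (here: mean close price)."""
    totals = {}
    for row in silver_rows:
        t = row["ticker"]
        s, n = totals.get(t, (0.0, 0))
        totals[t] = (s + row["close"], n + 1)
    return {t: s / n for t, (s, n) in totals.items()}

bronze = [
    {"ticker": "AAA", "close": 10.0},
    {"ticker": "AAA", "close": 12.0},
    {"ticker": None,  "close": 11.0},  # fails quality check: missing ticker
    {"ticker": "BBB", "close": -1},    # fails quality check: non-positive price
]
gold = to_gold(to_silver(bronze))  # {"AAA": 11.0}
```

In a DBT project the quality gates would typically be declarative tests (`not_null`, accepted ranges) rather than inline conditionals, with lineage and documentation generated from the model definitions.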

 


Qualifications

  • BS or pursuing an MS in Data Science, Data Engineering, Statistics, Business Analytics, Applied Math, or related field with strong academic performance
  • Strong Python and SQL fundamentals; comfort with Git and testing frameworks
  • Coursework or internship experience in data modeling, ETL/ELT, artificial intelligence/machine learning/statistics, or time-series analysis
  • Clear communication skills and ability to partner with investment, risk, and operations stakeholders

 


Preferred

  • Hands-on experience with Python, SQL, DBT, Spark, and modern data-quality toolkits
  • Exposure to ML frameworks (pandas, scikit-learn, PyTorch, MLflow) and feature pipelines
  • Familiarity with Databricks (Lakehouse, Unity Catalog) and AWS data services (S3, Glue/Athena, Lake Formation)
  • Experience with visualization and BI tools (e.g., Plotly, Tableau/Power BI) and financial data platforms (e.g., Bloomberg Terminal)
  • Experience with GenAI/LLM applications (prompt engineering, agentic workflows, RAG)

 


Tech Stack

  • Languages & Frameworks: Python (Pandas, scikit-learn, PyTorch, MLflow), SQL, DBT, Spark
  • Data & Platform: Databricks (Delta Lake, Unity Catalog, Serverless Compute), DBT, AWS (EC2, S3, Athena), Bloomberg Terminal
  • Tooling & Ops: GitHub/Bitbucket, Databricks Lakeflow, Airflow, CI/CD pipelines, observability frameworks, Linux, Cursor/VS Code

Compensation

  • Base Compensation Range: $40 - $50/hr
  • Eligible for overtime

 

Key Skills
Python, SQL, Data Engineering, Data Modeling, ETL/ELT, Databricks, AWS, Machine Learning, Statistics, Time-series Analysis, Git, Spark, CI/CD, Data Quality, Financial Modeling, Alpha Research
Categories
Finance & Accounting, Data & Analytics, Technology, Science & Research, Software