R Systems

www.rsystems.com

5 Jobs

6,221 Employees

About the Company

R Systems is a leading digital product engineering company that designs and builds next-gen products, platforms, and digital experiences, empowering clients across industries to overcome digital barriers, put their customers first, and achieve higher revenue and greater operational efficiency.

We constantly innovate and bring fresh perspectives to harness the power of the latest technologies, such as cloud, automation, AI, ML, analytics, and mixed reality. Our 4,300+ technology expeditioners across 28 offices are driven to explore new digital paths, leaving no stone unturned in our quest to deliver business solutions that drive meaningful impact.

Our product mindset, capabilities, and tools allow us to partner with a tech industry that is no longer limited to ISV and SaaS companies but also includes Telecom, Media, FinTech, InsureTech, and HealthTech players, enabling faster new-feature releases with full ownership and integration into the CI/CD pipeline.

Listed Jobs

Company Name
R Systems
Job Title
Java Application Developer with AWS - Remote position
Job Description
**Job Title**
Java Application Developer – AWS (Remote)

**Role Summary**
Design and implement Java-based replatform migrations to AWS, ensuring backward-compatible upgrades of existing applications and middleware. Build and maintain CI/CD pipelines, containerize workloads, and manage deployment across AWS services. Facilitate the end-to-end migration lifecycle, including testing, validation, cutover, and performance optimization.

**Expectations**
- Lead migration projects for Java applications, maintaining functionality while upgrading to modern platforms.
- Deliver quality releases on schedule, leveraging automation tools and best practices.
- Collaborate cross-functionally with infrastructure, QA, and ops to ensure smooth cutovers and production readiness.

**Key Responsibilities**
- Upgrade and replatform Java apps (Java 8/11/17) and associated middleware (WebLogic, WebSphere, JBoss/WildFly, Tomcat).
- Adapt application and configuration components for AWS Outposts.
- Develop, maintain, and optimize CI/CD pipelines using Jenkins/Bamboo, Maven/Gradle, Git, Terraform, Docker, and Kubernetes (ECS/EKS).
- Containerize applications and manage deployment orchestrations.
- Work with relational databases (SQL Server, Oracle, MySQL/PostgreSQL).
- Conduct performance tuning, troubleshoot migration-related issues, and validate production readiness.
- Document processes, transfer knowledge, and support post-deployment stabilization.

**Required Skills**
- Proficient in Java 8/11/17 with Spring Framework (Boot/MVC).
- Experience with ORM (Hibernate), build tools (Maven/Gradle), scripting (Bash/Python/Groovy).
- Strong grasp of CI/CD (Jenkins/Bamboo), Git workflows, and Terraform for IaC.
- Competence in Docker, Kubernetes (ECS/EKS preferred).
- Familiarity with AWS services, ELK/Splunk/AppDynamics, and XML/YAML configuration.
- Debugging, performance tuning, and migration testing expertise.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Software Engineering, or related field.
- AWS certifications (e.g., AWS Certified Developer, Solutions Architect) are advantageous but not mandatory.
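Illustrative aside (not part of the posting): the cutover validation and scripting work called out above often reduces to small checks against AWS APIs. A minimal Python/boto3 sketch, where the cluster and service names are hypothetical placeholders:

```python
# Hedged sketch: verify that an ECS service has settled after a cutover.
# Cluster and service names below are hypothetical placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def service_is_stable(cluster: str, service: str) -> bool:
    """Return True when the service is ACTIVE and its running task count matches the desired count."""
    resp = ecs.describe_services(cluster=cluster, services=[service])
    svc = resp["services"][0]
    return svc["status"] == "ACTIVE" and svc["runningCount"] == svc["desiredCount"]

if __name__ == "__main__":
    print(service_is_stable("orders-cluster", "orders-service"))
```

A check like this would typically run as a post-deploy gate in the kind of Jenkins/Bamboo pipeline the posting describes.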
United States
Remote
26-11-2025
Company Name
R Systems
Job Title
Fullstack Developer
Job Description
**Job Title**
Fullstack Developer

**Role Summary**
Design, develop, and maintain end-to-end web applications. Leverage modern front-end frameworks (React, Angular, Vue.js) and back-end technologies (Node.js, Python, Java, .NET) to deliver scalable, responsive user interfaces and robust server-side logic. Utilize Azure cloud services for application deployment, monitoring, and support of data-science projects.

**Expectations**
- Minimum 2 years' professional experience in full-stack development.
- Bachelor's degree in Computer Science, IT, or related field, or equivalent practical experience.
- Proven ability to work independently and in collaborative, agile teams.
- Strong problem-solving, communication, and documentation skills.

**Key Responsibilities**
- Build and maintain UI components with React/Angular/Vue.js and static web technologies.
- Design, implement, and expose RESTful APIs; manage SQL and NoSQL databases.
- Integrate third-party services and data-science dashboards.
- Write unit, integration, and end-to-end tests; debug and optimize code for performance and scalability.
- Deploy applications to Azure (or comparable cloud), configure CI/CD pipelines, and handle scaling, monitoring, and rollback procedures.
- Participate in code reviews, sprint planning, daily stand-ups, and retrospectives.
- Contribute to DevOps practices and DevSecOps principles.

**Required Skills**
- Front-end: HTML5, CSS3, JavaScript, ES6+, React/Angular/Vue.js.
- Back-end: Node.js, Python, Java, or .NET with experience in REST API design.
- Cloud: Azure full-stack development; familiarity with AWS or GCP is a plus.
- Databases: SQL (e.g., SQL Server, PostgreSQL) and NoSQL (e.g., MongoDB).
- Version control: Git (repository management, branching strategies).
- Testing: Jest, Mocha, Cypress, or equivalent frameworks.
- Containers & Orchestration: Docker and Kubernetes knowledge is desirable.
- Security: Understanding of OWASP guidelines and secure coding best practices.
- Agile methodologies: Scrum or Kanban process experience.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Information Technology, Software Engineering, or equivalent.
- Cloud certifications (e.g., Microsoft Certified: Azure Developer Associate) are preferred but not mandatory.
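Illustrative aside (not from the posting): the RESTful API responsibility above, taking Python as one of the listed back-end options, might look like this minimal Flask sketch; the route and payload are hypothetical:

```python
# Hedged sketch: a minimal REST endpoint of the kind listed under Key Responsibilities.
# The route, resource name, and response shape are illustrative only.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/items/<int:item_id>")
def get_item(item_id: int):
    # A real service would fetch this from the SQL/NoSQL store named in the posting.
    return jsonify({"id": item_id, "name": f"item-{item_id}"})

if __name__ == "__main__":
    app.run(debug=True)
```

The same endpoint would normally be covered by the unit and end-to-end tests the role calls for (e.g., via pytest or Cypress against a deployed instance).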
Denver, United States
On site
Junior
08-12-2025
Company Name
R Systems
Job Title
Data Architect (AI/ML) - Remote position
Job Description
**Job Title**
Data Architect (AI/ML) – Remote

**Role Summary**
Architect and govern end-to-end data and AI platforms, ensuring scalable, secure, and high-performance data pipelines that enable rapid deployment of machine-learning models across cloud environments. Manage offshore data engineering and Power BI teams, enforce coding standards, and drive AI/ML, DataOps, and MLOps initiatives.

**Expectations**
- Available during U.S. business hours (Eastern/Pacific time).
- Communicate clearly with business stakeholders and technical teams.
- Quickly translate business and technical requirements into architecture and implementation plans.
- Provide rigorous code-review and quality assurance oversight for offshore delivery.

**Key Responsibilities**
1. Design and maintain enterprise-level data models, databases, and data warehouses (e.g., Snowflake).
2. Build and optimize ETL/ELT pipelines using tools such as Matillion, dbt, or custom scripts.
3. Configure and manage cloud data services on AWS, Azure, and GCP (S3, Redshift, BigQuery, Synapse, etc.).
4. Lead AI/ML and generative-AI strategy, including model training, deployment, and monitoring.
5. Implement DataOps and MLOps practices (CI/CD pipelines, automated testing, model versioning).
6. Oversee code reviews, performance tuning, debugging, and quality checks for offshore data engineering and Power BI deliverables.
7. Mentor and coach offshore teams on architecture, best practices, and emerging technologies.
8. Ensure data security, governance, and compliance across all platforms.

**Required Skills**
- Expertise in data modeling, relational and columnar databases.
- Advanced SQL and experience debugging complex queries.
- Cloud platform proficiency (AWS, Azure, GCP).
- Experience with Snowflake, Matillion, and similar modern data integration tools.
- Strong foundation in AI/ML and generative-AI application development.
- Knowledge of DataOps and MLOps pipelines and tooling.
- Familiarity with Microsoft Power BI and report/dashboard delivery.
- Excellent written and verbal communication; ability to translate technical concepts for business stakeholders.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Data Engineering, Software Engineering, or a related field (or equivalent experience).
- Professional cloud certifications preferred:
  - AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate
  - Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert
  - Google Cloud Professional Data Engineer
- Snowflake SnowPro Certified or similar data platform certification is a plus.
- MLOps or AI fundamentals certification (e.g., Coursera, Udacity) is desirable.
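Illustrative aside (not from the posting): the automated-testing side of the DataOps practices in responsibility 5 often includes batch-level data-quality checks run in CI before a load is promoted. A minimal pandas sketch with hypothetical column names:

```python
# Hedged sketch: a pipeline-stage data-quality check of the kind a DataOps CI job
# might run before promoting a batch. Column names are hypothetical.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable problems; an empty list means the batch passes."""
    problems = []
    if df["order_id"].isnull().any():
        problems.append("null order_id values found")
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values found")
    if (df["amount"] < 0).any():
        problems.append("negative order amounts found")
    return problems

if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.5]})
    print(validate_orders(sample))  # flags the duplicate id and the negative amount
```

In practice the same checks would run against the warehouse tables (Snowflake, Redshift, BigQuery, etc.) rather than an in-memory sample.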
United States
Remote
10-12-2025
Company Name
R Systems
Job Title
Data Scientist
Job Description
**Job Title**
Data Scientist

**Role Summary**
Lead the design, development, and deployment of Generative AI solutions with Six Sigma rigor to improve network and customer operations. Integrate LLMs, RAG, and advanced modeling into scalable pipelines, ensuring statistical stability and high ROI.

**Expectations**
- Deliver production-ready AI models that drive measurable business outcomes.
- Demonstrate Six Sigma Black Belt discipline in end-to-end data science workflows.
- Translate technical insights into actionable strategies for executive stakeholders.

**Key Responsibilities**
- Research, prototype, and implement LLMs and Retrieval-Augmented Generation (RAG) for workflow automation.
- Apply DMAIC methodology to diagnose process inefficiencies and deploy AI solutions.
- Build, validate, and maintain predictive models (Churn, CLV, Propensity) using Python, PyTorch, and Scikit-learn.
- Optimize data pipelines for speed and inference efficiency in Databricks/AWS environments.
- Monitor model performance through Design of Experiments, hypothesis testing, and Statistical Process Control.
- Manage MLOps lifecycle: versioning, deployment, and monitoring in cloud environments.
- Serve as a technical liaison between AI teams and executive leadership, conveying results in Six Sigma terms.

**Required Skills**
- Generative AI: LangChain, LlamaIndex, Vector Databases (Pinecone, Milvus), model fine-tuning.
- Data Engineering: Advanced SQL, PySpark for large-scale data preparation.
- Statistical Analysis: DoE, hypothesis testing, SPC for drift detection.
- MLOps: Model versioning, deployment on Azure/AWS.
- Programming: Python, PyTorch, Scikit-learn.

**Required Education & Certifications**
- Master's or PhD in a quantitative discipline (Statistics, Computer Science, Engineering).
- Lean Six Sigma Black Belt or Six Sigma Black Belt certification.
- 5–7+ years of data science leadership experience in AI-driven environments.
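Illustrative aside (not from the posting): the churn/CLV/propensity modeling named above commonly starts from a scikit-learn baseline before any deep-learning or generative work. A minimal sketch on synthetic data:

```python
# Hedged sketch: a baseline churn/propensity classifier on synthetic data,
# illustrating the scikit-learn workflow the posting names.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced binary-classification data standing in for customer records.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.8], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Baseline ROC AUC: {auc:.3f}")
```

A baseline like this gives the statistical reference point against which later PyTorch or RAG-assisted approaches would be compared under the Six Sigma monitoring the role describes.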
Aurora, United States
On site
15-01-2026