7717 (12th Aug 2025) Recruitment Drive for Revature Consultancy Services Pvt. Ltd. | Hiring for “Multiple Positions” as Freshers | Drive Organized by VibrantMinds Technologies Pvt. Ltd.
Greetings from VibrantMinds Technologies Pvt. Ltd. – A Campus Recruitment, Online Assessment & IT Training Solutions Company
NOTE:
- This recruitment drive is OPEN for all eligible and interested candidates across India. VibrantMinds / NON-VibrantMinds students are allowed to apply if they fulfill the mentioned criteria.
- It’s a completely FREE OF COST Drive organized by VibrantMinds (No Charges to any candidate, anywhere).
- Shortlisted candidates must be available to attend the interview in person in Chennai during the first week of September.
Company Name: Revature Consultancy Services Pvt. Ltd.
Position 1: Junior DevOps Engineer
Position 2: Junior Denodo Developer & Administrator
Position 3: Junior Data Engineer
Experience: Fresher
Approx. Package Post Internship: Up to Rs. 4,62,000/- Per Annum.
Job Location: Pune / Hyderabad / Bangalore / Chennai / Noida
Educational Criteria:
- Education: BE/ BTech/ ME/ MTech/ MCA (CS/IT/ECE Branch ONLY)
- Batch: Pass-out year 2023, 2024 & 2025 batches only.
- Percentage: No percentage criteria.
Position 1: Junior DevOps Engineer
Job Description:
We are expanding our Data Practice and are hiring freshers and early-career professionals to join our Azure DevOps Engineering team. You’ll contribute to the automation and reliability of cloud infrastructure, CI/CD pipelines, and containerized workloads across client environments.
Ideal for tech-savvy individuals with strong fundamentals in cloud, scripting, and DevOps toolchains, this role offers hands-on exposure to real-world cloud engineering and delivery.
Role & Responsibilities:
- Support the setup and maintenance of Azure infrastructure using IaC tools like Terraform
- Configure and manage CI/CD pipelines on Azure DevOps
- Administer and automate Linux environments, shell scripting tasks, and Git workflows
- Deploy and maintain Dockerized microservices on Kubernetes clusters
- Implement Helm charts for application deployments
- Collaborate with teams to troubleshoot and monitor DevOps environments
- Follow Agile/DevOps methodologies and documentation standards
Skills Required:
Cloud & Azure Fundamentals
- Strong understanding of cloud computing, networking basics (IP, DNS, firewalls, etc.)
- Proficiency in Azure services: VNet, Storage Account, IAM (RBAC & Policies), Virtual Machines, Databases (SQL/NoSQL)
- Concepts of availability sets, scaling, and cost optimization
Version Control & OS
- Experience with Git (branching, merge, pull requests)
- Linux fundamentals and ability to write/debug Shell Scripts
Azure DevOps & CI/CD
- Experience or familiarity with:
- Azure Pipelines (classic or YAML-based)
- Artifact management, pipeline triggers, deployment strategies
- Service connections, approvals, and environments
Infrastructure as Code
- Use of Terraform for provisioning Azure infrastructure
- Managing Terraform modules, state files, and Azure Provider configurations
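As a rough, hedged illustration of this provisioning workflow, the Python sketch below simply drives the Terraform CLI from a script; the configuration directory and .tf files are assumed to already exist, and the path is a placeholder.

import subprocess

TF_DIR = "./infra/azure"  # placeholder: directory holding the .tf configuration

def run_terraform(args):
    # Invoke the Terraform CLI inside the configuration directory; raise if the command fails.
    subprocess.run(["terraform", *args], cwd=TF_DIR, check=True)

run_terraform(["init"])                 # download providers and set up the backend
run_terraform(["plan", "-out=tfplan"])  # preview changes and save the plan file
run_terraform(["apply", "tfplan"])      # applying a saved plan does not prompt for approval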
Containerization & Orchestration
- Docker (Dockerfiles, images, volumes, networking)
- Kubernetes (pods, deployments, services, ingress)
- Helm for packaging and deploying applications on Kubernetes
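A minimal sketch of the Docker basics listed above, assuming the Docker SDK for Python is installed and a Dockerfile exists in the current directory; the image name and port mapping are illustrative placeholders.

import docker

# Connect to the local Docker daemon using environment defaults.
client = docker.from_env()

# Build an image from the Dockerfile in the current directory (tag is a placeholder).
image, build_logs = client.images.build(path=".", tag="demo-service:latest")

# Run the image detached, mapping container port 8080 to the host.
container = client.containers.run("demo-service:latest", detach=True, ports={"8080/tcp": 8080})
print(container.name, container.status)

# List running containers, similar to `docker ps`.
for c in client.containers.list():
    print(c.name, c.image.tags)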
Nice-to-Have (Bonus Points):
- Awareness of other cloud platforms (AWS/GCP)
- Exposure to tools like Ansible, Jenkins, Prometheus, or Grafana
- Participation in DevOps bootcamps, hackathons, or open-source contributions
- Certification in AZ-900, AZ-104, or Terraform Associate
Ideal Candidate Profile:
- Strong logical reasoning and troubleshooting mindset
- Capable of reading technical documentation and implementing best practices
- Willing to work in client-facing, project-based environments
- Flexible with shifts or relocations if required by the project
Career Growth:
- Opportunity to specialize in CloudOps, SRE, or Kubernetes Platform Engineering
- Continuous learning path via certifications and internal CoEs
- Work with clients across banking, telecom, energy, and healthcare sectors
Position 2: Junior Denodo Developer & Administrator
Job Description:
We are looking for enthusiastic and skilled Junior Denodo Developers/Administrators to join our Enterprise Data Virtualization Practice. This role is ideal for candidates with foundational training or certification in Denodo, along with a basic grasp of enterprise data access and security frameworks. You will work in project teams enabling data virtualization, integration, and security for global client applications.
This is an excellent opportunity to get hands-on experience in Denodo Platform Development, Administration, and Testing, especially suited for candidates interested in data engineering and platform management roles.
Role & Responsibilities:
- Create and configure data sources (JDBC, JSON/XML, Excel/CSV)
- Build Virtual Databases, Base Views, Derived Views, Flatten Views with mandatory filters
- Develop Selection Views, Publish Derived Views as REST/SOAP APIs
- Configure Access Controls for views using SQL and GUI-based permissions
- Create and manage Scheduler Jobs (VDP, VDPCache jobs)
- Write VQL Procedures, Java Stored Procedures, and Custom Functions
- Implement Row-level and Column-level Security
- Validate application access integrated with Denodo
- Apply Denodo Optimization Techniques: Cost-based optimization, Branch Pruning, etc.
- Understand and use various Cache Strategies: Full, Partial, Incremental
- Utilize SUMMARY and REMOTE tables appropriately for performance
- Understand Denodo platform architecture and components:
- Scheduler, Data Catalog, Diagnostic & Monitoring Tool
- Perform Basic Installation and Configuration of the Denodo platform
- Create Roles/Users and define row-level and column-level access policies
- Configure Memory Management, JVM Settings, and Resource Monitoring
- Conduct Load Testing using JMeter
- Manage environments using Denodo Solution Manager
- Perform Code Promotion & Backup using Bitbucket, Azure DevOps, Solution Manager
- Testing Tasks
- Configure and execute scripts using the Denodo Testing Tool (DTT Utility)
- Set up and maintain test automation for data view validations
- Understand parameters, test assertions, and output logging
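As a loose sketch of validating application access to a view published as a REST API, the snippet below just calls the published endpoint over HTTP; the URL, view name, and credentials are hypothetical placeholders, not actual Denodo paths, and the real path depends on how the service was published from the VDP server.

import requests

# Hypothetical endpoint for a published derived view; replace with the real service URL.
VIEW_URL = "https://denodo.example.com/rest/customer_orders_view"

# Query the published view with basic-auth placeholder credentials.
response = requests.get(VIEW_URL, auth=("report_user", "change_me"), timeout=30)
response.raise_for_status()
print(response.json())  # the response layout depends on the published service's configuration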
Skills Required:
- Completed Denodo Developer training/coursework
- Good understanding of SQL and data modeling concepts
- Familiarity with REST/SOAP APIs
- Strong logical reasoning and troubleshooting abilities
- Willingness to work in support, development, or admin mode depending on project needs
Nice-to-Have:
- Awareness of data integration patterns and ETL tools
- Exposure to Azure, AWS, or GCP cloud platforms
- Experience with JMeter, Git, Bitbucket, Jenkins, or related tools
- Knowledge of Agile delivery models
Career Progression:
- Opportunity to specialize in Data Virtualization Engineering, Platform Admin, or DataOps
- Work across BFSI, Healthcare, Telecom, and Manufacturing clients
- Grow into roles involving Data Governance, Architecture, or Cloud Data Platforms
Position 3: Junior Data Engineer
Job Description:
Join our Data Modernization & Cloud Integration practice as a Junior Data Engineer. You’ll work on enterprise-scale data engineering initiatives, supporting client projects that involve Oracle, Amazon Redshift, and Shell Scripting-based automation. If you’re passionate about databases, scripting, and cloud-based data pipelines, this role offers the perfect launchpad.
Role & Responsibilities:
- Set up, monitor, and optimize Redshift clusters and Oracle environments
- Automate data ingestion using shell scripts and scheduled jobs
- Write complex SQL and PL/SQL queries for business reporting
- Collaborate with ETL teams and analysts to build robust data workflows
- Troubleshoot data loads and optimize queries for performance
- Participate in data migration, integration, and monitoring tasks
Skills Required:
Amazon Redshift
- Architecture & Setup: Node types, distribution styles, leader/compute roles
- Hands-on: Spin up clusters, configure storage, test parallel execution
- Data Loading: Using COPY from S3 (CSV/Parquet), IAM roles
- Performance Tuning: Vacuum, Analyze, WLM tuning, key selection
- Complex SQL: Joins, CTEs, subqueries, window functions
- ETL/Reporting Integration: Glue, Python scripts, Power BI, Tableau
- Hands-on: Build reporting queries, test ETL pipelines to Redshift
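A minimal sketch of the COPY-from-S3 loading pattern mentioned above, using a psycopg2 connection (Redshift speaks the PostgreSQL protocol); the cluster endpoint, credentials, table, bucket, and IAM role ARN are all placeholders.

import psycopg2

# Placeholder connection details for a Redshift cluster.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="dev",
    user="awsuser",
    password="change_me",
)
conn.autocommit = True

# Load CSV files from S3 into a staging table using an IAM role attached to the cluster.
copy_sql = """
    COPY sales_staging
    FROM 's3://example-bucket/sales/2025-08-12/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""
with conn.cursor() as cur:
    cur.execute(copy_sql)
conn.close()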
Oracle
- Architecture & Admin: Tablespaces, schemas, roles, privileges
- Hands-on: Create users, roles, assign storage quotas
- SQL & PL/SQL: Procedures, triggers, cursors, exception handling
- Data Import/Export: SQL*Loader, Data Pump
- Performance Tuning: Explain plans, indexes, AWR, partitioning
- Migration to Redshift: AWS DMS, CDC tools, export-load workflows
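For the Oracle SQL side, a small hedged sketch using the python-oracledb driver; the DSN, credentials, table, and bind value are illustrative placeholders only.

import oracledb

# Placeholder DSN in host:port/service_name form.
conn = oracledb.connect(user="app_user", password="change_me", dsn="dbhost.example.com:1521/ORCLPDB1")
cur = conn.cursor()

# Simple bind-variable query; in practice this logic might live in a PL/SQL procedure.
cur.execute(
    "SELECT employee_id, last_name FROM employees WHERE department_id = :dept_id",
    {"dept_id": 10},
)
for employee_id, last_name in cur:
    print(employee_id, last_name)

cur.close()
conn.close()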
Shell Scripting
- Core Scripting: Variables, loops, functions, conditionals
- File Parsing: grep, awk, sed, cut, sort, uniq
- Job Scheduling: cron syntax, logging, monitoring
- ETL Automation: Move files to S3, trigger Redshift COPY/Oracle INSERT
- Monitoring: Health check and alert scripts using mail/log functions
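A short, hedged sketch of the file-to-S3 step of such ETL automation using boto3; the bucket, key, and file path are placeholders, the script would typically be run from cron, and the actual load would then be triggered with a COPY like the one sketched earlier.

import logging
import boto3

logging.basicConfig(filename="etl_upload.log", level=logging.INFO)

s3 = boto3.client("s3")
local_file = "/data/out/sales_2025-08-12.csv"                 # placeholder extract produced by the ETL job
bucket, key = "example-bucket", "sales/2025-08-12/sales.csv"  # placeholder destination

# Upload the extract to S3; a Redshift COPY (or Oracle load) would be kicked off afterwards.
s3.upload_file(local_file, bucket, key)
logging.info("Uploaded %s to s3://%s/%s", local_file, bucket, key)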
Nice-to-Have:
- Familiarity with AWS services (S3, DMS, Glue)
- Exposure to DevOps tools: Jenkins, Git, Docker
- Understanding of data lake or lakehouse architectures
- Basic Python scripting for ETL use cases
- Knowledge of BI tools like Power BI, Tableau
Ideal Candidate Profile:
- Strong problem-solving and analytical thinking
- Exposure to academic/mini projects related to databases or scripting
- Effective communication and collaboration abilities
- Eagerness to learn and apply new tools, especially in a client delivery setting
- Open to working in shifts or high-demand environments based on client/project need
Career Growth Path:
- Upskill to Cloud Data Engineer, ETL/ELT Specialist, or DB Migration Consultant
- Hands-on opportunities in AWS/GCP/Azure data platforms
- Work across BFSI, Retail, Healthcare, and Telecom domains
- Training on automation, performance engineering, and client-ready practices
Deadline to Apply: Wednesday, 13th Aug 2025 till 10.00 AM (Profiles won’t be considered after the deadline)
How to Apply: https://forms.gle/B39DXyBBgNsNGkFB7
Regards,
VibrantMinds Technologies Pvt. Ltd.
Visit us: www.vibrantmindstech.com | 9503579517
Address: 2nd Floor, Viva Building, Near St. Mary’s Church & Vardhman Petrol Pump, Mumbai-Bangalore Highway, Warje, Pune 411058
Check our Success Stories: https://vibrantmindssuccessstories.blogspot.com/
Instagram: https://www.instagram.com/vibrantminds_technologies/