Hadoop Scala Jobs
Rocket Consulting - India, IN (+1 location)
languages such as Java, Scala, or Python, with experience in big data frameworks and tools (e.g., Hadoop, Spark). - Strong understanding of distributed computing principles and concepts, including parallel
from: adzuna.in (+1 source) - 4 days ago
Idyllic Services Pvt Ltd - India, IN
to define and contribute at a high-level to many aspects of our collaborative agile development process. Must Haves : - Hands-on experience with Data Engineering (Spark, Scala, Hadoop and scheduling services
from: adzuna.in - 4 days ago
Target - Bengaluru, Karnataka, India
configuration and deployment along with ability to build custom solutions Have experience in building Data Pipelines using Scala, Spark, Hadoop, HiveQL etc. Have experience with streaming frameworks such as Kafka
from: linkedin.com - Yesterday
TD Newton - Pune, Maharashtra, IN
Scala 2.10 or Java development and design using Java 1.8. - Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark streaming, YARN, Kafka, Hive, Python, ETL
from: adzuna.in - 4 days ago
CGI Information Systems and Management Consultants - Bangalore, Karnataka, IN
Job Description : - Minimum 4-8 years' experience as a Data Engineer. - Experience with designing & developing production-grade distributed data systems. - Experience with Scala and Hadoop
from: adzuna.in - 4 days ago
Infomatics Corp - Hyderabad, Telangana, India
in a fast-paced, data-driven environment. Proficiency in programming languages such as Python, Java, or Scala, with a strong emphasis on data processing and manipulation. Experience with distributed computing
from: linkedin.com - Yesterday
Pylon Management Consulting - India, IN (+1 location)
. - Experience with multi-threading will be helpful. - Experience working with Kafka will be helpful. - Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus. - Exposure to DevOps
from: adzuna.in (+1 source) - 4 days ago
Target Corporation - Bangalore, Karnataka, IN
and differentiating features Working knowledge on package-specific configuration and deployment along with ability to build custom solutions Have experience in building Data Pipelines using Scala, Spark, Hadoop, Hive
from: adzuna.in (+1 source) - 4 days ago
Annalect India - Bengaluru, Karnataka, India (+1 location)
of Linux, Bash, Python, and SQL experience 4+ years using Spark and other Hadoop ecosystem software 3 to 4+ years using AWS cloud services Experience on Data Lake and Databricks using PySpark and Scala
from: linkedin.com - Yesterday
IFLOWTECH SOLUTIONS PRIVATE LIMITED - India, IN
Job Description : As part of your duties, you will be responsible for : - Design and implement scalable and robust data pipelines using Scala within the Hadoop ecosystem. - Collaborate with cross
from: adzuna.in - 7 days ago