Kafka Administrator

Job Location: IN-Pune
Job Category: Information Technology

As a Data Platform Administrator for Streaming and Distributed Data Platforms in SAS Cloud, you will be responsible for the operations of our hosted data platform environment. You will work collaboratively with support, analysts, developers, and other implementation team members to maintain the operational integrity of our data platform installations and to manage, diagnose, debug, and maintain SAS and third-party enterprise-class software systems. You will support SAS Cloud with expertise, mentoring, and best practices in data platform technology and architecture.

Primary Responsibilities:


  • Manage and monitor operations of various data platform systems
  • Perform upgrades, scripting, task automation, and backup/recovery
  • Create and maintain engineering documents, system designs, and written documentation for operational procedures
  • Tune the performance of systems in SAS Cloud
  • Use monitoring systems and diagnostic tools to maintain system health
  • Adhere to and enforce security standards and policies
  • Learn additional data platform technologies supported by SAS Cloud Data Platform
  • Multi-task in a hosting- and production-support-driven environment

Additional Responsibilities:


  • Serve as a technical escalation point and support other team members.
  • Contribute to overall service quality by identifying ways to improve the customer experience with SAS Cloud services.
  • Work shifts as required by business demand.
  • Participate in a 24x7x365 on-call rotation.

What we’re looking for:


  • 2+ years of experience with any of the following: Cloudera Hadoop, Kafka, Snowflake, or SingleStore

  • Experience with Hadoop technologies such as HDFS, Pig, Hive, Spark, Presto, and Impala, and with management of Hadoop systems
  • Experience applying Java to MapReduce and Spark
  • Experience with Microsoft Azure
  • Experience with Perl, Python, or other shell/batch scripting languages
  • Proficiency in Unix, Linux, and Windows operating systems

Additional Skills and Abilities:

  • Excellent written and verbal communication skills
  • Excellent interpersonal and problem-solving skills, as well as strong decision-making skills
  • Ability to handle a fast-paced environment and prioritize multiple tasks in a team setting
  • Ability to interact well with development groups and end users
  • Ability to diagnose and resolve problems in a timely fashion
  • Ability to work flexible business hours as required by global customer and business needs
  • Ability to meet the challenges of working across different cultures, work practices, and time zones

Preferred Skills:
  • Experience with Kafka and other streaming components of the Hadoop ecosystem
  • Experience with Elasticsearch
  • Experience with Kibana
  • Experience with security and encryption in Hadoop
  • DBA experience with more than one of the following RDBMS: Oracle, MS SQL Server, MySQL, or PostgreSQL
  • Experience with Amazon Web Services (AWS) including RDS, Redshift, EMR, S3, etc.
  • Experience with Azure Services including HDInsight, Databricks, IoT Hub, Event Grid, ADLS Gen2, etc.
  • Experience with Java development
  • Experience with SAS software


  • Equivalent combination of education, training, and relevant experience may be considered in place of the requirements stated above.
  • The level of this position will be determined based on the applicant's education, skills and experience.

