Job Description: Technical Requirements
5+ years of hands-on experience with AWS services such as S3, EC2, AWS Glue, Kinesis, DMS, and Redshift
5+ years of hands-on experience with related/complementary open source platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef, Scala)
5+ years of experience working on public cloud platforms (AWS, Azure, or Google Cloud)
Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)
Knowledge of, or hands-on experience with, BI tools and reporting software (e.g. Tableau)
Hands-on experience with analytical tools, languages, or libraries (e.g. SageMaker, TensorFlow)
Hands-on experience with productionizing platform services and applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
Hadoop platforms and distributions: EMR or similar products
Previous experience with high-scale or distributed RDBMSs (SQL Server, DB2, Oracle)
Proficient understanding of the underlying infrastructure and cloud foundations required for a data lake
Strong understanding of cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud data architectures and designs
Track record of thought leadership and innovation around the data landscape
Solid understanding of cloud computing technologies and related emerging technologies (e.g. Amazon Web Services EC2, Elastic MapReduce, Azure, GCP) and considerations for scalable, distributed systems
Knowledge of NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)
Japanese and English at or near business level