Data Architect | Regular Shift
Hunter's Hub Inc.
- Unit 1902, Cityland 10 Tower 2, 154 H.V. Dela Costa St. corner Valero, Bel-Air, Makati City, Metro Manila, Philippines
- Full time
Job Description
The Big Data Solutions Architect addresses specific big data problems and requirements, describing the structure and behavior of a big data solution and how it can be delivered using big data technologies such as Hadoop. Candidates must have hands-on experience with Hadoop applications (e.g., administration, configuration management, monitoring, debugging, and performance tuning). The Big Data Solutions Architect is responsible for managing the full life cycle of a Hadoop solution, including requirements analysis, platform selection, design of the technical architecture, application design and development, testing, and deployment of the proposed solution.
Roles and Responsibilities:
- Familiar with key concepts of distributed processing for big data technologies and related technologies in the field.
- Experience leading a team or performing a Proof of Concept (POC) and producing results and/or recommendations on how to use the technology in conjunction with current applications.
- Proven ability to find integration points with new technology and existing legacy technology.
- Experience in designing solutions with fully redundant systems and DR processes / failovers
- Ability to draft end-to-end solutions, including logical and physical implementation designs
- Flexibility to deploy solutions on-premises, on-cloud, or, in some cases, as hybrid or virtualized implementations
- Make suggestions on how to improve existing processes and reduce risks through redesigns
- Explain how business requirements are satisfied by the proposed technological solution, and explain this to the customers as the main beneficiaries of the proposed solution
- Experience evaluating vendor proposals and giving recommendations on the best approach to take
Minimum Qualifications
- Key Technology Requirements:
- Familiar with Data Warehousing concepts as EDS is mandated to perform key data warehousing functions
- Knowledgeable and proficient with existing legacy technologies and their integration points
  - Databases (Oracle, Teradata, Vertica, etc.)
  - Reporting technologies (PowerBI, Tableau, Cognos, etc.)
  - ETL technologies (Talend, Informatica, etc.)
- Experience with a cloud technology (AWS, Google Cloud, Microsoft Azure)
- Familiar with virtualization technologies (VMWare, Docker, Kubernetes)
- Familiar with both batch and streaming processing technologies (Spark, MapReduce, Kafka, Flink)
- In depth experience with API integrations
- Knowledgeable about overall system security, such as Kerberos and other security layers and tools (e.g., each cloud platform has its own security implementations)
- Familiar with redundancy software (Keepalived, load balancers, HAProxy, etc.)
- Proficient in the big data ecosystem and surrounding technologies (HDFS, YARN, Cloudera, etc.)
- Knowledge of machine learning platforms (how the ML process works) is a plus
- Firm understanding of major programming/scripting languages such as Java, PHP, Ruby, Python, and/or R, as well as Linux shell scripting
- 5 to 8 years’ experience in a similar role
- Qualifications and Accreditations:
- Amazon Web Services (AWS) Certified Data Analytics – Specialty
- Cloudera Certified Associate (CCA) Spark and Hadoop Developer
Jobs Summary
- Job Level
- Associate / Supervisor
- Job Category
- IT and Software
- Educational Requirement
- Bachelor's degree graduate
- Office Address
- Cityland 10 Tower 2, 154 H.V Dela Costa St. Corner Valero, Bel-air, Makati City