Engineering Jobs | Data Engineer | Jobs Alert | General Mills | Mumbai | Latest Job 2022
We exist to make food the world loves. But we accomplish more than that. General Mills is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives, and reimagine new possibilities, every day. We look for people who want to bring their best: bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what's next.
Job Description
This team is responsible for designing and architecting solutions that integrate and transform business data into the Data Lake, delivering a data layer for the enterprise using big-data technologies such as Hadoop. We provide solutions to address the growing demand to combine more internal and external data with existing sources, and we research, develop, and apply new technologies to give the company more actionable insights.
To provide end-to-end solutions for the business, we integrate process, technology landscapes, and business information from the primary enterprise data sources that make up our corporate information factory.

This role works on Enterprise Data Lake and Data Warehouse solutions. You will be responsible for building data lake solutions for business intelligence and data mining.
Roles and Responsibilities
- 70% of time: Create, code, and support a variety of Hadoop, ETL, and SQL solutions
- Experience with agile techniques and methods
- Work effectively in a distributed, global team environment
- Work on pipelines of moderate scope and complexity
- Communicate effectively on technical and business topics, with good influencing skills
- Analyze existing processes and user development requirements to ensure maximum efficiency
- Participate in the implementation and deployment of emerging tools and processes in the big-data space
- Turn information into insight by consulting with architects, solution managers, and analysts to understand business needs and deliver solutions
- 20% of time: Support existing data warehouses and related jobs
- Job scheduling experience (Tidal, Airflow, Linux)
- 10% of time: Proactively research current technologies and techniques for development
- Bring an automation mindset and a continuous-improvement mentality to streamline processes and eliminate waste
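The posting itself contains no technical detail, but the ETL and SQL work described above can be illustrated with a toy extract-transform-load step. This is a minimal sketch using Python's standard-library sqlite3 as a stand-in for a real warehouse; the table and column names are illustrative assumptions, not from the posting.

```python
import sqlite3

# Toy "source" records, standing in for data extracted from an upstream system.
source_rows = [
    {"sku": "A1", "units": 3, "unit_price": 2.50},
    {"sku": "B2", "units": 5, "unit_price": 1.20},
    {"sku": "A1", "units": 2, "unit_price": 2.50},
]

def run_etl(rows):
    """Extract rows, transform them (derive revenue), and load into SQLite."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (sku TEXT, units INTEGER, revenue REAL)")
    # Transform step: compute revenue per row before loading.
    transformed = [(r["sku"], r["units"], r["units"] * r["unit_price"]) for r in rows]
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transformed)
    conn.commit()
    return conn

conn = run_etl(source_rows)
# Aggregate in SQL, as a downstream warehouse query would.
totals = dict(conn.execute(
    "SELECT sku, SUM(revenue) FROM sales GROUP BY sku ORDER BY sku"
))
print(totals)  # {'A1': 12.5, 'B2': 6.0}
```

In a production pipeline the same extract/transform/load split applies, with Hadoop or a warehouse replacing the in-memory database.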
Eligibility
- Minimum Degree Requirements: Bachelors
- Preferred Degree Requirements: Bachelors
- Preferred Major Area of Study: Engineering
- Minimum years of Hadoop experience required: 2 years
- Preferred years of Data Lake/Data warehouse experience: 2-4+ years
- Total experience required: 4-5 years
- Skill levels (Beginner / Intermediate / Advanced / Expert):
- HDFS, MapReduce
- Hive, Impala & Kudu
- SQL, PLSQL
- Data Warehousing Concepts
- Other Competencies
- Demonstrate learning agility & inquisitiveness towards latest technology
- Seeks to learn new skills via experienced team members, documented processes, and formal training
- Ability to deliver projects with minimal supervision
- Delivers assigned work within given parameter of time and quality
- Self-motivated team player with the ability to overcome challenges and achieve desired results
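As context for the HDFS/MapReduce skill listed above: the MapReduce model splits work into a map phase that emits key-value pairs, a shuffle that groups values by key, and a reduce phase that aggregates each group. A minimal word-count sketch in plain Python (no Hadoop cluster involved; the input lines are invented for illustration):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big ideas", "data lake"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'ideas': 1, 'lake': 1}
```

On a real cluster the map and reduce phases run in parallel across HDFS blocks, but the programming model is the same.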