At Rogers, we recognize that success is determined by the strength and diversity of our people. We work together because we want to win together, and these shared values guide and define our work:
- Simplify and innovate
- Take ownership of the what and the how
- Equip people to succeed
- Execute with discipline and pride
- Talk straight, build trust, and over-deliver
Every day, we strive to build a brilliant digital future for Canadians. We work as one team, with one goal: serve our customers better.
Rogers is seeking a Senior Manager, Big Data Innovation to manage the Big Data Hadoop Development team and evolve Rogers' enterprise Big Data clusters, applications, and services.
Reporting to the Director of Big Data Innovation at the Rogers Brampton Campus, this role acts as a Development Manager for a quickly growing team of Big Data Hadoop application developers, focusing on deploying a wide range of Big Data applications and services for Business, IT, Network, and external clients.
- Lead the initiative in developing and executing an ongoing enterprise-wide strategy for Big Data, while continuously evolving the platform and its analytical and data management tools
- Partner with Business and IT stakeholders to evolve, enhance, and refine Big Data use cases and applications based on clearly defined business drivers and benefits
- Collaborate with various Business and IT groups to identify and develop interface points with internal and external third-party platforms and data sources
- Lead cross-functional teams to fully implement Big Data solutions by adopting iterative, agile development, automated testing, quality assurance, and release management methodologies, best practices, and standards
- Lead the Big Data design and development team in delivering scalable and reliable Big Data solutions leveraging the Hadoop data platform, traditional RDBMSs, BI tools, SaaS platforms, and APIs
- Manage cost estimates, including hardware, software, labour, and licenses
- Manage a team of highly motivated individuals delivering on strategic initiatives, and provide ongoing coaching on both current and future technology
- Ensure detailed documentation is maintained as required by the SDLC
- Set standards and practices for deployment, optimization, and tuning of the Hadoop development framework
- A degree in Computer Science, Data Science, Software Engineering, Information Technology, or a related field
- Hadoop 2.0 Developer, Administrator, or an equivalent certification
- 3+ years of production experience managing cross-functional project teams: clients, developers, solutions design, production support, third parties, and vendors
- 1+ years of production experience managing Big Data environments and development teams
- 1+ years of production experience managing Hadoop application development, installations, performance tuning, configuration, and optimization
- 1+ years of production experience deploying Hadoop core components and services: HDFS, YARN, Hive, Pig, HBase, Spark, Sqoop, Falcon, Oozie, etc.
- Strong background in Business Intelligence and production experience with data warehousing projects and programs
- Production experience with, and excellent understanding of, data warehousing lifecycle principles and best practices
- Production experience with, and excellent understanding of, Software Development Lifecycle principles and best practices
- Production experience with, and in-depth understanding of, the Hadoop Big Data ecosystem and best practices (Hortonworks preferred)
- Experience working with virtualized and physical environments, including public and private cloud
- Experience in the telecommunications industry would be an asset
- Experience with analytics, predictive modeling, and machine learning applications would be an asset
- Strong analytical and problem-solving skills
- Highly motivated and proactive, dedicated to follow-up and follow-through without relying on management for direction