
Data Engineer

Discovery Communications
Sterling, VA 20166
  • Job Code
    108654074

Requisition ID 22429
Career Category IT & Technical Operations
Posted Date 2019-03-20

Location US-VA-Sterling
Type Company Employee Full-Time

Position Summary:


Our Team

As Discovery Inc's portfolio continues to grow - around the world and across platforms - the Global Technology & Operations team is building media technology and IT systems that meet the world-class standard for which Discovery is known. GT&O builds, implements and maintains the business systems and technology that are critical for delivering Discovery's products, while articulating the long-term technology strategy that will enable Discovery's growing pay-TV, digital terrestrial, free-to-air and online services to reach more audiences on more platforms.

From Amsterdam to Singapore and from satellite and broadcast operations to SAP, we are driving Discovery forward on the leading edge of technology.

The Global Data Analytics team enables Discovery to turn data into action. Using big data platforms, data warehousing and business intelligence technology, audience data, advanced analytics, data science, visualization, and self-service analytics, this team supports company efforts to increase revenue, drive ratings, and enhance consumer engagement.

The Role

The Data Engineer is a hands-on technical contributor with knowledge of all phases of building large-scale, cloud-based distributed data processing systems and applications. You will be part of the Global Data & Analytics engineering technology team and will partner closely with data scientists, business analysts, and data engineers leading Discovery's cloud-based Big Data & Analytics strategy.
You'll work on implementing complex AWS-based big data projects focused on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple technology platforms. This role therefore requires an understanding of how a secure big data cloud environment is architected to deliver real insights faster, with less friction and complexity. The Data Engineer should be passionate about working with cutting-edge technologies, solving problems, and developing prototypes using open-source tools for the selected solutions.

You'll need to be an innovative, forward-thinking engineer who helps lead end-to-end execution of data engineering initiatives and contributes directly to existing and emerging business strategies and goals. Creativity, attention to detail, and the ability to work in a collaborative team environment are essential.

The Data Engineer will work closely with the Data Engineering Director to determine infrastructure architecture and software design needs, and will execute on those decisions.



Responsibilities:


1. Lead the design, implementation, and continuous delivery of pipelines using distributed, AWS-based big data technologies supporting data processing initiatives across batch and streaming datasets
2. Develop in Scala and Python using big data frameworks such as Spark, EMR, Presto, AWS Athena, Kafka, Zeppelin, and Kinesis
3. Provide administrative support for deployed AWS platform components
4. Identify, evaluate, and implement cutting-edge big data pipelines and frameworks needed to integrate external data sources and APIs
5. Review, analyze, and evaluate market requirements, business requirements, and project briefs to design the most appropriate end-to-end technology solutions
6. Process and manage high-volume, real-time customer interaction streams
7. Provide architectural support by building proofs of concept and prototypes
8. Proactively deliver data engineering solutions that optimize both cost and existing solutions
9. Stay current with emerging technologies and industry trends



Requirements:


* Bachelor's degree or higher in Computer Science or a related field
* Minimum of 5-6 years of software industry experience
* 3+ years of development experience with AWS services; must have EC2, EMR, Redshift, Data Pipeline or Airflow, S3, CloudFormation, the AWS CLI, and Jenkins
* 3+ years of development experience with Apache Spark, Presto, SQL, notebook environments, and NoSQL implementations
* 4+ years of working knowledge across multiple programming languages: Scala (must), Shell, and Python (must)
* Proficiency working with structured, semi-structured, and unstructured data sets, including social data, web logs, and real-time streaming data feeds
* Able to tune big data solutions to improve performance and end-user experience
* Knowledge of visualization and data science tools
* Expert-level use of Jenkins and GitHub is preferred
* Spark developer certification is a plus
* Ability and eagerness to constantly learn and teach others
* Experience in the media industry is a plus

* Must have the legal right to work in the United States

Sterling, VA



Discovery Communications, Inc. is an equal opportunity employer. Discovery is committed to being an employer of choice: not just a good place to work, but a great and inclusive place to work. To that end, we strive to recruit and maintain a workforce that meaningfully represents the diverse and culturally rich communities that we serve. Qualified applicants will receive consideration for employment without regard to their race, color, religion, national origin, sex, sexual orientation, gender identity, protected veteran status, disability status, or genetic information.

EEO is the Law

Pay Transparency Policy Statement

If you are an individual with a disability and need an accommodation during the application process, please send an email request to [email protected]
Posted: 2019-03-21 Expires: 2019-06-09
