Organization: UN Global Pulse
Country: United States of America
Closing date: 20 Aug 2015
Vacancy code: VA/2015/B5004/8219
Position title: Data Engineer
Duty station: New York, United States of America
Contract type: International ICA
Contract level: IICA-2
Duration: 4 months (with possibility of extension)
Application period: 06-Aug-2015 to 20-Aug-2015
Background Information – Global Pulse
Global Pulse is a United Nations innovation initiative harnessing new digital data to help the public sector understand sooner how communities are coping with the impacts of global crises. To this end, Global Pulse is developing a twenty-first century approach to impact monitoring, building social software for evidence-based decision making, and establishing a network of “Pulse Labs” to mainstream the use of real-time data analysis into development policy. Pulse Labs are national facilities established to support the public sector by developing analytical and technological capacity for harnessing real-time data to strengthen social protection. The overarching objective of Global Pulse is to mainstream the use of data mining and real-time data analytics into development organizations and communities of practice.
The Data Engineer will work in the Global Pulse New York Lab, accessing and deriving insights from large data sets to solve problems. He or she must be able to work with a variety of data types, programming languages and platforms, must be comfortable with both hands-on development and research support, and must be adept at both understanding the information content of data and transforming and preparing it for analytical and reporting purposes.
Functional Responsibilities
- Specify and build prototype analytics platforms based on complex and streaming datasets;
- Participate in the development and optimization of analytics that enhance program effectiveness;
- Develop and maintain analytics and information reporting applications;
- Collaborate on data architecture and design decisions;
- Work with data science and domain expert teams to define software specifications;
- Develop and use software tools to extract, process and analyze data sets from a variety of sources;
- Develop tools and algorithms that can scale to very large data sets;
- Develop content classification systems;
- Develop new ways to understand and visualize data;
- Set up and manage infrastructure capable of ingesting, storing and processing very large data sets;
- Plan, oversee and optimize resource costs.
Competencies
Planning and Organizing – Experience with project management. Develops clear goals that are consistent with agreed strategies; identifies priority activities and assignments; adjusts priorities as required; allocates appropriate amount of time and resources for completing work; foresees risks and allows for contingencies when planning; monitors and adjusts plans and actions as necessary; uses time efficiently. Demonstrable ability to meet deadlines, work under pressure, manage workflows and operate as part of a dispersed team with members across various time zones. Good organizational skills, with timely and detail-oriented implementation of tasks.
Communication – Speaks and writes clearly and effectively; listens to others, correctly interprets messages from others and responds appropriately; asks questions to clarify, and exhibits interest in having two-way communication; tailors language, tone, style and format to match audience; demonstrates openness in sharing information and keeping people informed.
Teamwork – Works collaboratively with colleagues to achieve organizational goals; solicits input by genuinely valuing others’ ideas and expertise; is willing to learn from others; places team agenda before personal agenda; supports and acts in accordance with final group decision, even when such decisions may not entirely reflect own position; shares credit for team accomplishments and accepts joint responsibility for team shortcomings.
Knowledge Management and Learning – Shares knowledge and experience; actively works towards continuing personal learning, acts on learning plan and applies newly acquired skills.
Leadership and Self-Management – Focuses on results for the client and responds positively to feedback; consistently approaches work with energy and a positive, constructive attitude; remains calm, in control and good-humored even under pressure.
Professionalism – Demonstrable ability to work independently with minimal supervision, but also to function effectively as part of a team. Comfortable working with diverse programming languages, design patterns, frameworks, libraries and platforms. Flexible, adaptable, and comfortable working in a start-up-like environment. Ability to work directly with internal and external clients to define analysis and reporting requirements. Passion for Big Data, computers, software engineering, architecture design, and application development.
Education/Experience/Language requirements
Education:
Master’s degree in Computer Science or related field. A combination of a first degree and an additional two years of relevant experience may be accepted in lieu of an advanced degree.
Experience:
- A minimum of 5 years of relevant, progressively responsible experience in stream computing and data warehousing is required.
- Experience working with *NIX, OS X and Windows systems is required.
- Experience with languages commonly used in classic and big data processing (Python, Java, Scala, SQL, …) and in the presentation layer (JavaScript, HTML, CSS, …) is required.
- Experience working with open source data processing frameworks and libraries is required.
- Experience in parallel processing is required.
- Experience with Amazon Web Services, Hadoop and related technologies is an asset.
Language:
Fluent written and oral English required. Knowledge of any other UN language an asset.
How to apply:
To view the full vacancy and apply online, visit the UNOPS Global Personnel Recruitment System at: https://gprs.unops.org/pages/viewvacancy/VADetails.aspx?id=8219