Informatica/Big Data SME (REMOTE POSITION)

Req #: 132104
Location: North Charleston, SC US
Job Category: Information Technology
Security Clearance: Secret
Clearance Status: Must Be Current

Job Description

Informatica / Big Data SME:
 
- Provide technical and development support to the government client to build and maintain a modernized Enterprise Data Warehouse (EDW). The Informatica ETL Developer is responsible for developing, maintaining, and supporting the Data Warehouse / Data Mart environment.

- Perform data formatting, which involves cleaning up the data.

- Responsible for devising the data conversion strategy for a system focused on delivering an advanced analytical capability to the client.

- Create new Informatica scripts, transition PL/SQL ETLs to new Informatica scripts, or modify existing scripts. Develop Informatica mappings and write complex Oracle PL/SQL programs for the Data Warehouse. Other tasks include analyzing transaction errors, troubleshooting software issues, developing bug fixes, and contributing to performance tuning efforts. Assess the suitability and quality of candidate data sets for the Data Lake and the EDW.

- Design and prepare technical specifications and guidelines.

- Act as a self-starter with the ability to take on complex projects and analyses independently
 
Requirements:
A minimum of 5 years of development, software training, and administration experience within a data warehouse environment. Tools utilized: Informatica PowerCenter 9.6, Informatica Big Data Management 10.1, Data Profiler, Teradata Enterprise Data Warehouse (EDW), Teradata Studio, Hadoop, SAS, Tableau, Aster, R.
Ability to pick up new tools easily; teachability
Data integration experience, including strong time management skills and the ability to design, document, develop, and test data integration processes from analyst specifications
General data management knowledge, mainly data lineage and metadata management
General knowledge of Hadoop/HDFS/Hive
Experience with relational databases, such as SQL Server and Oracle, and the ability to write SQL commands is required
Provide technical support to the government client to build and maintain a modernized Enterprise Data Warehouse (EDW) by expanding the current on-premises Hadoop cluster to accommodate an increased volume of data flowing into the enterprise data warehouse.
Ability to perform data formatting, which involves cleaning up the data, assigning schemas, and creating Hive tables
Apply formats and structure to support fast retrieval of data, user analytics, and analysis
Assess the suitability and quality of candidate data sets for the Data Lake and the EDW
Design and prepare technical specifications and guidelines.
Act as a self-starter with the ability to take on complex projects and analyses independently
Knowledge of SLES 11 OS.
DOD Secret clearance and 8570 certification required.
 
Working knowledge of Linux
Working knowledge of Java and Eclipse-based products (an infrastructure product would be best, but prior experience supporting any product would help)
Working knowledge of Windows servers (support of developer tools)
Ability to learn new skills quickly
Informatica exposure would be nice to have
 
Preferred:
General knowledge of Teradata processing 
Knowledge of ELT in Teradata (PDO, optimal processes)
Practical skills with efficient file movement inside of Hadoop
Linux command line experience
Informatica BDM architecture knowledge
Scripting experience (Unix/Python/bash/bat)
 
 
Nice to have:
Experience with JSON, XML, EDI
Informatica Developer use
DBH input
 
Qualifications:

- Bachelor's or Master's degree in a technical or business discipline, or related experience, required. BS in Computer Science, Math, or Engineering desired

- Experience in data integration development, with 5 years' experience as a lead developer preferred

- Requires excellent analytical ability; consultative, communication, presentation, and management skills; strong judgment; and the ability to work effectively with clients, IT management, and staff.

- Requires strong negotiation, facilitation, and consensus-building skills; in-depth knowledge of project planning methodologies and tools and of IT standards and guidelines; and knowledge of management concepts, practices, and techniques.

- Ability to develop processes that utilize reusable objects, repeatable design patterns, and clear coding, following the project's data integration standards and guidelines

- In-depth knowledge of IT concepts, strategies, and methodologies; thorough knowledge of business functions and operations, objectives and strategies.

- Prior use of Informatica Developer transformations

 
Preferred:

- Experience working in an agile environment (Scrum, Kanban, XP, etc.)

- Experience working & managing modern data technologies including Hadoop, AWS, Informatica Big Data Management

- Prior experience in development of real-time or near real-time data integration processes

- A programming background with Unix/Python/bash or bat scripting is strongly preferred.

- Ability to develop Big Data architecture; experience contributing to and influencing change

- Experience working in a similar environment (i.e., DoD/DHA services)

- Full-phase implementation experience with the Informatica ETL tool; data warehouse and ETL design, build, and test experience, including requirements assessment, ownership of the data integration architecture, and ETL scheduling using Informatica
- Demonstrated strength and current experience using Informatica on Teradata and/or Hadoop
Delivery:
- Experience within a delivery role
- Experience with client facing consulting engagements
- Extensive experience within management consulting or consulting services
 
Soft Skills:
- Excellent analytical and problem solving skills
- Excellent verbal and written communication skills
- Mentors team members in technology, architecture and delivery of applications
- Creates a shared sense of direction and community among the team
- Proven ability to transfer knowledge and stay aware of current trends and technical advancements
- Ability to articulate and present different points-of-views on various technologies
- Time management skills are a must, as well as the ability to be flexible and creative
- A strong track record of professional success, preferably in the Consulting Services arena

*Work can be performed remotely.*

EDUCATION & EXPERIENCE:
Typically requires bachelor's degree and five to seven years of related experience.
 
PHYSICAL DEMANDS:
Normal demands associated with an office environment. Ability to work on a computer for long periods and to communicate with individuals by telephone, email, and face to face. Some travel may be required.

Job Location

North Charleston, SC, US


 

CACI employs a diverse range of talent to create an environment that fuels innovation and fosters continuous improvement and success. At CACI, you will have the opportunity to make an immediate impact by providing information solutions and services in support of national security missions and government transformation for Intelligence, Defense, and Federal Civilian customers. CACI is proud to provide dynamic careers for employees worldwide. CACI is an Equal Opportunity Employer - Females/Minorities/Protected Veterans/Individuals with Disabilities.
