Fannie Mae is looking to hire a Sr. Software Engineer - Enterprise Data to design, build, test, and implement moderately complex software, technology, and processes. The role also creates and maintains IT architecture, large-scale data stores, and cloud-based systems that expand access to homeownership and affordable rental housing across the country.
Requirements
- 2+ years with Big Data Hadoop clusters (HDFS, Hive, MapReduce), Spark, AWS EMR, and ECS
- 2+ years of recent experience building and deploying applications in AWS (S3, Hive, Glue, AWS Batch, DynamoDB, Redshift, AWS EMR, CloudWatch, RDS, Lambda, SNS, SWS, etc.)
- 2+ years with Python, SQL, SparkSQL, and PySpark
- Skilled in cloud technologies and cloud computing
- Programming including coding, debugging, and using relevant programming languages
- Skilled in creating and managing databases using relevant software such as MySQL, Hadoop, or MongoDB
- Skilled in discovering patterns in large data sets using relevant software such as Oracle Data Mining or Informatica
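As a rough illustration of the MapReduce pattern named in the requirements above, here is a minimal, single-machine sketch of the map, shuffle, and reduce phases using only the Python standard library (on a real cluster, frameworks such as Hadoop or Spark distribute these phases across nodes; the word-count task and all function names here are illustrative, not part of the posting):

```python
from collections import defaultdict
from functools import reduce

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    return [(word, 1) for line in lines for word in line.split()]

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework would between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: fold each key's values down to a single count.
    return {key: reduce(lambda a, b: a + b, values)
            for key, values in groups.items()}

lines = ["big data big cluster", "big data"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# counts == {"big": 3, "data": 2, "cluster": 1}
```

The same three-phase shape underlies Hive queries and PySpark jobs; the distributed engines differ mainly in how the shuffle is partitioned and persisted.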
Responsibilities
- Independently determine the needs of the customer and create solution frameworks.
- Design and develop moderately complex software solutions to meet needs.
- Use a process-driven approach in designing and developing solutions.
- Implement new software technology and coordinate end-to-end tasks across the team.
- May maintain or oversee the maintenance of existing software.
- Create and maintain IT architecture, large-scale data stores, and cloud-based systems.
Other
- Bachelor's degree or equivalent
- 4+ years of experience with Big Data Hadoop clusters, AWS, and Python
- Knowledge of Spark streaming technologies
- Experience working with agile development teams
- Familiarity with Hadoop / Spark information architecture, Data Modeling, Machine Learning (ML)