Modernizing and migrating an existing PyQt-based analytics application to a high-performance web application built with modern web technologies, such as WebGL, in support of mission-critical operations, including satellite data analysis.
Requirements
- Proficiency in Python and experience with data analysis libraries such as NumPy, Pandas, SciPy, and Plotly.
- Experience with database systems, including PostgreSQL, and with designing and optimizing database schemas for large-scale data processing.
- Experience with GUI development frameworks like PyQt and basic proficiency in JavaScript for web development.
- Experience with web development technologies, including WebGL, HTML5, CSS, and modern JavaScript frameworks or libraries (e.g., React, Vue.js, or Three.js) for building interactive web applications.
- Experience with algorithm development for pattern recognition and trend analysis in time-series data.
- Proficiency in geospatial data processing using tools like PostGIS or GeoPandas, with exposure to web-based geospatial visualization libraries (e.g., Leaflet, Mapbox).
- Familiarity with performance optimization techniques, such as SIMD vectorization or the use of libraries like Numba.
Responsibilities
- Lead the migration of a PyQt-based analytics platform to a web-based application using modern web frameworks and technologies like WebGL for enhanced visualization and performance.
- Design and develop Python-based backend systems with integrated database solutions (e.g., PostgreSQL) to process millions of records daily, including large time-series datasets related to satellite operations.
- Develop and optimize algorithms to identify patterns, trends, and anomalies in complex datasets, incorporating basic orbital mechanics principles for spacecraft data analysis.
- Implement geospatial data processing capabilities to support location-based analytics and visualization in a web environment.
- Implement comprehensive logging and data preservation mechanisms to ensure work persistence and enable continuous analysis across operational shifts.
- Modernize legacy systems by replacing fragmented processes with unified, scalable web-based solutions that enhance data quality and operational efficiency.
- Maintain and enhance platform performance, ensuring reliability and scalability in 24/7 operational environments.
- Collaborate with analysts and stakeholders to define requirements, ensuring platforms support detailed follow-up work and historical pattern recognition.
- Document system architecture, workflows, and processes using modern tools for clear communication with technical and non-technical audiences.
Other
- Bachelor’s degree in Mathematics, Computer Science, Data Science, or a related field and a minimum of 4 years of prior related experience; or a graduate degree and a minimum of 2 years of prior related experience; or, in lieu of a degree, a minimum of 8 years of prior software experience.
- Top Secret or TS/SCI Clearance.
- Experience in software engineering or data analysis, with a focus on processing large datasets.