We are looking for an experienced Developer with expertise in Java for building crawlers and
security automation tools, and in [Link] for user interface (UI) development. You will work on
both backend automation and frontend development, playing a critical role in enhancing our
cybersecurity platform.
Job Responsibilities:
● Java Development:
○ Design and develop crawlers or other security automation tools using Java.
○ Write efficient and scalable code to perform automated security tasks such as
scanning, data extraction, and threat detection.
○ Integrate crawlers and automation tools with backend services and APIs.
○ Optimize automation tools for performance, accuracy, and reliability in different
environments.
● [Link] Development:
○ Build and enhance user interfaces for internal and external tools using [Link].
○ Collaborate with UI/UX designers and backend developers to create responsive
and user-friendly web applications.
○ Ensure smooth integration of frontend with backend services, including data
visualization and reporting tools.
○ Write clean, maintainable, and scalable code that adheres to best practices.
○ Participate in code reviews and improve code quality through feedback and
testing.
Qualifications:
● 5+ years of experience in Java development, particularly in building automation tools,
crawlers, or similar backend processes.
● Strong experience in [Link] for building modern, responsive UIs.
● Proficiency in JavaScript, HTML5, and CSS3.
● Experience with building and integrating RESTful APIs.
● Familiarity with version control systems (e.g., Git) and Agile methodologies.
● Strong problem-solving skills, attention to detail, and the ability to work independently.
● Experience with multi-threading, performance optimization, and handling large-scale
data in crawlers or automation tools.
Preferred Qualifications:
● Familiarity with security-related tools, penetration testing, or vulnerability management.
● Experience with database technologies (e.g., MySQL, MongoDB) to store and manage
data extracted by crawlers.
● Understanding of testing frameworks such as Jest or JUnit.
● Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and deploying automation tools
in distributed environments.