Solutions Architect – AWS Big Data Developer
Tracking Code 2578-749
Smartronix, Inc., a Premier Amazon Web Services Consulting Partner and Microsoft Gold Partner, is currently seeking a Big Data Solutions Architect with a strong development background. The candidate will bring a passion for technology and an ability to understand, assess, and implement cutting-edge solutions. We are seeking a driven individual who wants to work with some of the best cloud architects in the market. Today, Smartronix delivers cloud services to Fortune 1000 financial and healthcare companies, government agencies, and other regulated markets worldwide.
In this role, the Big Data Solution Architect will directly support and work as a hands-on technical contributor within the Solution Delivery Organization. They will solve complex data centric problems by using analytical thinking and development tools.
They will bring proven industry experience and creative thinking to the role. They will exploit both proven and new techniques, leveraging not only industry-proven COTS solutions but also cloud-native capabilities to the fullest extent possible.
This is a highly visible position that may require the candidate to interact directly with executive-level clients. The candidate will work in lockstep with project management to navigate complex customer landscapes.
They will be able to look across a variety of data sources to devise proven yet inventive ways of organizing, consolidating, and optimizing how the data is used and what it means.
They will not only bring the key skills of a Big Data and development specialist, but also those of an analyst, architect, programmer, communicator, operations engineer, and trusted adviser. They must be able and willing to collaborate as part of any project team, large or small. They must be receptive to feedback, willing to take on any task, and capable of clearly communicating with their customers, management, and other key stakeholders in both technical and high-level terms.
– Provide hands-on subject matter expertise to build and implement Hadoop-based Big Data solutions
– Research, evaluate, architect, and deploy new tools, frameworks, and patterns to build and develop sustainable Big Data platforms, capabilities and integrations for our clients
– Design and implement complex, highly scalable, data-centric and data-driven solutions that comply with security requirements
– Identify gaps and opportunities for the improvement of existing client solutions
– Collaborate with internal team members to propose and deliver technical solutions
– Interact and collaborate directly with client contacts, including at the executive level
– Define and develop APIs (as well as leverage existing APIs) for integration with various data sources in the enterprise
– Actively collaborate with other architects and developers in developing client solutions
– Write technical documentation and standard operating procedures for both internal company usage and customer distribution or delivery
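As a rough illustration of the kind of hands-on provisioning work described in the responsibilities above, the sketch below composes an `aws emr create-cluster` CLI invocation from Python, the sort of scripting that automates cluster onboarding. The cluster name, release label, and instance settings are illustrative placeholders, not a recommended configuration.

```python
import shlex

def emr_create_cluster_cmd(name, release="emr-5.30.0", instance_type="m5.xlarge",
                           instance_count=3, applications=("Hadoop", "Spark", "Hive")):
    """Compose an `aws emr create-cluster` CLI invocation as an argument list.

    All default values here are illustrative, not a production configuration.
    """
    args = [
        "aws", "emr", "create-cluster",
        "--name", name,
        "--release-label", release,
        "--instance-type", instance_type,
        "--instance-count", str(instance_count),
        "--use-default-roles",
    ]
    # The CLI accepts one --applications option with repeated Name= entries.
    args += ["--applications"] + [f"Name={app}" for app in applications]
    return args

cmd = emr_create_cluster_cmd("demo-cluster")
print(shlex.join(cmd))
```

Building the command as a list (rather than a single string) keeps it safe to pass to `subprocess.run` without shell quoting issues.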
– 4+ years hands on experience in a development-centric role
– Hands-on experience with AWS Native Services provisioning, installation, and configuration
– Hands-on experience designing, developing, and administering solutions that utilize EMR, Redshift, S3, EC2, ELB, EBS and/or other native services on static and/or ephemeral AWS platforms
– Hands-on experience with AWS EMR with Spark, RDS, Redshift, and Hive metastore configuration
– Emphasis on the design, development, and implementation of engineering solutions that support efficient, automated onboarding, performance monitoring, regular maintenance, and offboarding through the AWS CLI and scripting languages that integrate CLI solutions
– Hands-on experience with AWS Developer Tools (e.g., CodeCommit, CodeDeploy, CodePipeline) and AWS APIs or CLI
– Implementation and tuning experience specifically using Amazon Elastic MapReduce (EMR)
– Experience with Spark on EMR
– Experience with RDS (PostgreSQL)
– Good understanding of Hive metastore configuration
– Experience optimizing Redshift clusters
– Excellent oral and written communications
– Experience with AWS CloudFormation constructs and implementation
– Performance tuning of Spark
– Experience synchronizing data between Hive, RDS, and Redshift
– Experience with programming languages such as shell scripting, Java (Spark), Python, and SQL
– Good understanding of Operating Systems (Unix/Linux preferred) including network related configurations, authentication mechanisms, storage configurations
– General understanding of enterprise networks and network topologies
– Ability to engineer or develop cloud-native solutions for the efficient onboarding, monitoring, maintenance, and operation of production workloads
– Ability to develop monitoring and auto-scaling solutions geared at managing and providing transparency to AWS resource usage and costs
– Proven ability to make sound decisions with the customer and best practice in mind
– Proven implementation approach based on long-term strategic objectives, while taking into consideration short-term implications for ongoing or planned implementations
– Ability to communicate with both technical and non-technical audiences
– Strong interpersonal skills
– Willingness and ability to collaborate across internal practices and with other team members
– Able to work effectively within a delivery-centric organization, focused on delivering quality solutions
– BS/MS in Computer Science or a related field; additional years of experience may be used in lieu of a degree
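The qualifications above call for experience synchronizing data between Hive, RDS, and Redshift. One common pattern is to have a Spark or Hive job on EMR write Parquet files to S3 and then load them into Redshift with a `COPY ... FORMAT AS PARQUET` statement. The sketch below builds such a statement; the table name, S3 prefix, and IAM role ARN are hypothetical placeholders.

```python
def redshift_copy_stmt(table, s3_prefix, iam_role_arn):
    """Build a Redshift COPY statement that loads Parquet files written to S3
    (e.g., by a Spark or Hive job on EMR). All identifiers are illustrative.
    """
    return (
        f"COPY {table} "
        f"FROM '{s3_prefix}' "
        f"IAM_ROLE '{iam_role_arn}' "
        "FORMAT AS PARQUET;"
    )

stmt = redshift_copy_stmt(
    "analytics.events",                              # hypothetical target table
    "s3://example-bucket/events/",                   # hypothetical S3 prefix
    "arn:aws:iam::123456789012:role/RedshiftCopy",   # hypothetical IAM role
)
print(stmt)
```

In practice the statement would be executed against the cluster through a PostgreSQL-compatible driver, since Redshift speaks the PostgreSQL wire protocol.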
Job Location Reston, Virginia, United States
Position Type Full-Time/Regular
US Citizenship Required
Clearance Level Required: Public Trust
To apply for this job, please visit itjobpro.com.