
Big Data Solutions Architect

Company: Horizontal Talent
Location: Greeley
Posted on: November 26, 2022

Job Description:

Description

The mission for this role is to engineer, build, deploy, and maintain robust big data solutions and platforms for business needs.

Responsibilities:

  • Responsible for gathering, maintaining, planning, developing, and implementing data solutions and tools to support business units across the organization.
  • Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.
  • Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
  • Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
  • Documents work and processes and applies best-in-class data governance policies.
  • Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
  • Works closely with a team of frontend developers, product and business managers, analysts, and PMO.
  • Designs data integrations and data quality framework.
  • Designs and evaluates open source and vendor tools for data lineage.
  • Works closely with all business units to develop strategy for long term data platform architecture.
  • Manages a portfolio of data integration and BI projects from start to finish, acting as the liaison between the business, IT, Digital Transformation teams, and third-party providers.
  • Designs rich data visualizations to communicate complex ideas to leaders.
  • Leads the integration of external vendor data, evaluates new solutions, and sets process standards.
  • Provides key support for vendor management.
  • Leads the planning and execution of assigned projects.
  • Other duties as assigned.

Qualifications:
    • BS or MS degree in Computer Science or a related technical field
    • Project management and leadership skills are essential
    • 2+ years of Python development experience
    • 2+ years of SQL experience
    • 2+ years of experience with schema design and dimensional data modeling
    • 2+ years of experience with machine learning
    • Ability to manage and communicate data warehouse plans to internal clients
    • Experience designing, building, and maintaining data processing systems
    • Strong analytical and logical skills, curiosity, organization, attention to detail, and discipline are crucial for this position.
    • Ability to perform technical deep-dives into code, networking, systems and storage configuration
    • Hands-on experience developing cloud data infrastructure
    • Proficient in Python, SQL, and popular cloud platforms (Microsoft Azure, AWS, or Google Cloud Platform).
    • Technical hands-on skills in machine learning (regression, classification, clustering, dimensionality reduction), deep learning (CNN, RNN/LSTM), time series data, optimization, anomaly detection, statistical algorithms, and data engineering.
    • Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, Decision Forests, etc.
    • Good applied statistics skills, such as distributions, statistical testing, regression, etc.
    • Experience with big data technologies such as Hadoop, R, and Java / MapReduce a plus
    • Experience with NoSQL databases, such as MongoDB, Cassandra, HBase, DynamoDB a plus
    • SAP BI HANA knowledge and QlikSense reporting are a plus
    • Working knowledge of relational databases
    • Familiar with microservices architecture
    • Good knowledge of RESTful API design and development
    • Experience with Git code management practices (GitHub)
    • Experience in writing unit tests to ensure code quality
    • Current understanding of best practices regarding system security measures
    • Experience in data engineering and design architecture
    • Demonstrated experience with agile development methods. Agile at scale is a plus.
    • A drive to learn and master new technologies and techniques.
    • Experience in agricultural/meat/animal industries is a plus.
    • As a salaried position with the company, you may be required to travel at some point to other facilities, to attend Company events, or as a representative of the Company in other situations. Unless otherwise specified in this posting, the amount of travel may vary, and the most qualified candidate must be willing and able to travel as business needs dictate.

The applicant who fills this position will be eligible for the following compensation and benefits:
      • Benefits: Vision, Medical, and Dental coverage begin after 60 days of employment;
      • Paid Time Off: sick leave, vacation, and 6 company-observed holidays;
      • 401(k): eligible after 90 days of employment, including a company match which begins after the first year of service and follows the company vesting schedule;
      • Base salary range of $115,000-$146,000; and
      • Incentive Pay: This position is eligible to participate in the Company's annual bonus plan; the amount of bonus varies and is subject to the standard terms and conditions of the incentive program.
