Company Description

SourceDay is the leading cloud-based software solution that automates the purchase order management process. By seamlessly integrating with ERP systems, SourceDay extends purchasing capabilities by centralizing and managing the PO lifecycle for buyers and suppliers, eliminating manual processes while improving supplier performance.

Job Description

SourceDay specializes in bringing our customers' data to life through our proprietary transformations. These transformations are built, by folks like yourself, across an array of complex ecosystems. Through SourceDay's optimized subscribe-and-publish model (sketched below), you will have the opportunity to keep enhancing our data pipeline capabilities and to onboard new customers at a quick pace. SourceDay needs someone with experience developing data integrations of third-party solutions for various ERPs, such as Epicor, Syteline, Microsoft NAV & GP, Visual, Intuitive, Acumatica, and NetSuite. Having worked with one or more of these ERPs is a significant bonus.

Do you like to:
• Build APIs and interfaces for others to consume data? (A minimal endpoint sketch appears at the end of this posting.)
• Bootstrap data pipelines to be self-sufficient and easily maintainable?
• Take on big data challenges by applying your mastery of capturing, cleaning, storing, and securing the data you collect?
• Take pride in your stewardship of the data you collect and in the security, visualization, and queryability you are able to provide?

If you answered yes to all of these, you will enjoy being part of SourceDay's engineering team.
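As a rough illustration of the subscribe-and-publish pattern mentioned above, here is a minimal in-process sketch in Python. The Broker class, topic name, and message shape are hypothetical examples, not SourceDay's actual implementation:

```python
# A minimal in-process publish/subscribe sketch; topic names and message
# fields are hypothetical, not SourceDay's actual schema.
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    """Routes each published message to every subscriber of its topic."""

    def __init__(self) -> None:
        self._subs: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subs.get(topic, []):
            handler(message)

broker = Broker()
# A downstream transformation subscribes to raw PO updates...
broker.subscribe("po.updated", lambda msg: print("transforming", msg["po_number"]))
# ...and an ERP connector publishes changes as it observes them.
broker.publish("po.updated", {"po_number": "PO-1001", "status": "acknowledged"})
```

In production a hosted broker (e.g., Amazon SNS/SQS or Kafka) would stand in for the in-process Broker, but the decoupling between ERP connectors (publishers) and transformations (subscribers) is the same.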
Qualifications
• Studies data sources by interviewing customers; defining, analyzing, and validating data objects; and identifying the relationships among data objects.
• Plans the data integration process by developing common definitions of sourced data; designing common keys in the physical data structure; establishing data integration specifications; examining data applications, data models, and data warehouse schemas; determining best-fit data interchange methods; assessing middleware tools for data integration, transformation, and routing; developing project scope and specifications; identifying factors that negatively impact integration; forecasting resource requirements; and establishing delivery timetables.
• Delivers data integration by implementing shared databases; integrating data shared across legacy, new-development, and purchased-package environments; developing system modification specifications; mapping data; establishing interfaces; developing and modifying functions, programs, routines, and stored procedures to export, transform, and load data (an ETL sketch appears at the end of this posting); meeting performance parameters; resolving and escalating integration issues; coordinating actions among users, operations staff, and outside vendors; recommending adjustments as objectives change; and documenting operational procedures and data connections.
• Validates data integration by developing and executing test plans and scenarios, including data design, tool design, and data extract/transform.
• Maintains data warehouse performance by identifying and resolving data conflicts and upgrading data definitions.
• Improves data integration by designing and evaluating new data interchange formats, improving physical design, and rewriting data policy, standards, and procedures.
• Maintains team accomplishments by communicating essential information; coordinating actions; obtaining expert input; reviewing open issues and action items; contributing information to team meetings and reports; and transferring knowledge of data integration processes, techniques, and issues to application and support teams.

Requirements
• Bachelor's degree in Computer Science or a related field, or equivalent experience
• Experience: 4-6 years
• Strong analytical and problem-solving skills
• Strong SQL and database knowledge
• Good understanding of data governance and other data quality practices
• Experience leveraging and integrating data pipelines within the AWS services ecosystem
• General networking understanding
• Installation and configuration experience with various mid-market to enterprise ERP systems such as Epicor, Syteline, Microsoft NAV & GP, Visual, Intuitive, Acumatica, and NetSuite
• Hands-on experience building data processing pipelines (e.g., in Storm, Beam, Spark, Flink, Lambda); see the sketch below
• Strong experience with object-oriented and/or functional languages (e.g., C#, Java, Scala, Go, Python)
• Proficiency with metaprogramming languages (e.g., Ruby)
• Deep understanding of relational and NoSQL data stores (e.g., Snowflake, Redshift, BigTable, Spark) and approaches
• Strong experience developing services within IIS and SQL Server
• Strong API development experience in building scalable services

Additional Information

Benefits
• Unlimited vacation
• Onsite workout facility
• Fully stocked kitchen
• Taco Tuesdays for breakfast and Donut/Kolache Thursdays
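As a rough illustration of the export-transform-load work described in the Qualifications, and of the pipeline engines the Requirements mention (Spark among them), here is a minimal PySpark sketch. The input file, column names, and partitioning key are hypothetical examples, not SourceDay's actual schema:

```python
# A minimal ETL sketch, assuming PySpark is installed and a local Spark
# session suffices; file and column names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("po-etl-sketch").getOrCreate()

# Extract: load raw purchase-order lines exported from an ERP.
raw = spark.read.option("header", True).csv("po_lines.csv")

# Transform: normalize types, drop rows missing a PO number, deduplicate.
clean = (
    raw.withColumn("qty", F.col("qty").cast("int"))
       .withColumn("unit_price", F.col("unit_price").cast("decimal(12,2)"))
       .filter(F.col("po_number").isNotNull())
       .dropDuplicates(["po_number", "line_number"])
)

# Load: write a partitioned, query-friendly copy for downstream consumers.
clean.write.mode("overwrite").partitionBy("supplier_id").parquet("po_lines_clean/")
```

The same extract/clean/load shape carries over to Storm, Beam, Flink, or Lambda; only the runtime changes.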
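And for the API-building work mentioned in both the "Do you like to" list and the Requirements, here is a minimal read-only endpoint sketch, using Flask purely for brevity; the route and the in-memory data are hypothetical stand-ins for a real database:

```python
# A minimal data-consumption API sketch, assuming Flask is installed; the
# in-memory store is a hypothetical stand-in for SQL Server or a warehouse.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a PO table.
PURCHASE_ORDERS = {
    "PO-1001": {"supplier": "Acme", "status": "acknowledged", "lines": 3},
}

@app.route("/api/v1/purchase-orders/<po_number>")
def get_purchase_order(po_number: str):
    po = PURCHASE_ORDERS.get(po_number)
    if po is None:
        abort(404)
    return jsonify(po)

if __name__ == "__main__":
    app.run(port=8080)
```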
IBM MQ with SQL Server DBA and Operations Specialist
PGM TEK, Inc. - New York, NY
PGM TEK, Inc. is seeking an IBM MQ with SQL Server DBA and Operations Specialist to manage IBM MQ instances and support SQL Server integrations. The role requires expertise in cloud and on-prem environments, along with strong scripting skills.
GenAI Developer
Techwavz Inc - North Chicago, IL
Techwavz Inc is seeking a GenAI Developer to build, deploy, and maintain an enterprise-level RAG application in Azure. The role requires expertise in cloud infrastructure, backend and frontend development, and AI engineering.
Release engineer
Saviance Technologies - San Jose, CA
Saviance Technologies is seeking a Senior Build and Release / DevOps Engineer in San Jose, CA, to lead DevOps initiatives and streamline software development processes. The role involves managing automated build pipelines and collaborating with development teams to enhance software delivery.
Job Type
Full-time role
Skills required
C#, Java, Go, NoSQL
Location
Austin, TX
Salary
No salary information was found.
Date Posted
November 2, 2024
SourceDay, Inc. is seeking a Data Integration Engineer to enhance data pipeline capabilities and integrate third-party ERP solutions. The role involves building APIs, managing data transformations, and ensuring data quality across various systems.