01. Our client
Our client is a Stockholm-based IT Lifecycle Management platform that helps organizations manage every stage of their IT assets. From procurement and onboarding to daily operations and end-of-life handling, the platform brings everything together through its Universal API. This API connects directly with IT resellers, MDM systems, HR platforms, and other essential tools.
The company works with clients such as ZeroNorth, ZoCom, Möller Bil, and Minna Technologies. They also maintain strict security standards, including AES-256 encryption, EU Data Processor compliance, and ISO 9001 & ISO 27001 certifications.
The Problem:
As the company grew, more customers requested new system integrations. Each integration—HRIS, MDM, Telecommunications, Distributors, or Shipment—came with its own authentication methods, data structures, and business rules. With more than 20 integrations already in place, the backlog kept increasing.
Developers repeatedly rebuilt the same setup and test foundations for every integration. This slowed down delivery, created bottlenecks, and made it harder to keep all integrations aligned with the same coding approach.
Desired Outcome:
The client wanted to reduce the time spent on repetitive work, bring more consistency across integrations, improve reliability, and support ongoing demand without compromising quality.
02. Challenge
The client’s Universal API needed to handle many different types of integrations, each with specific requirements. This led to several challenges:
- High variability across integrations: Every system used different authentication methods, mapping rules, and error-handling needs. Developers often rebuilt similar foundations from scratch.
- Inconsistent patterns across the codebase: As integrations increased, small differences in implementation became harder to manage and maintain.
- Too much repetitive boilerplate work: A large portion of integration development involved recurring steps that did not require deep domain knowledge but still consumed valuable time.
- Longer onboarding for new developers: Understanding existing integrations and internal patterns required significant time before contributing effectively.
- Rising customer expectations: More clients asked for additional integrations or modifications, putting pressure on the team to deliver faster.
The client needed a development approach that was predictable, structured, and scalable to keep up with product demand.
03. Cooperation
To solve these issues, the team introduced AI-assisted development gradually. The goal was to improve the process without pausing ongoing feature delivery. Instead of a complete shift all at once, the team implemented small, practical updates that could be tested immediately on real tasks.
This step-by-step approach allowed the team to validate results, adjust rules, and expand AI involvement with confidence.
1. Cursor Rules Creation
The team started by documenting the most complex integrations. These integrations contained detailed mapping rules and error-handling logic. Turning them into clear AI-readable rules became the foundation for reliable AI assistance.
Once the core logic was captured, the team expanded the rules to describe architecture, naming conventions, folder structure, and expected behavior across the entire repository.
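As a hedged illustration of what such a rule might look like (the file name, glob path, and guidelines below are hypothetical, not taken from the client’s repository), a Cursor project rule is typically a small file under .cursor/rules/ with frontmatter describing when it applies:

```
---
description: Conventions for HRIS integration modules
globs: src/integrations/hris/**
alwaysApply: false
---
- Reuse the shared HTTP client wrapper; never call fetch directly.
- Map vendor fields to the internal Employee shape; emails are lower-cased.
- Wrap outbound calls in the standard retry helper (3 attempts, exponential backoff).
- Every new endpoint ships with Jest tests that mock HTTP traffic with Nock.
```

Because the rule is scoped by a glob, the AI loads it only when working inside the matching integration folder, which keeps context focused and predictable.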
2. Test Coverage Expansion
A high level of test coverage was essential for predictable results. By raising coverage above 90%, the team gave the AI clear definitions of expected behavior. The tests captured not only expected outputs but also the domain-specific rules that guided AI decisions.
3. Workflow Integration
Cursor was connected to the team’s main tools:
- Linear: AI could review code, spot issues, and create tickets automatically.
- Slack: Developers triggered tasks by tagging Cursor or linking a ticket.
- GitHub: Cursor submitted pull requests, wrote code, and added review comments.
This created a development flow where the AI could support every stage—from task creation to code reviews.
4. Autonomous Enablement
As the rules library matured, Cursor gained the ability to complete straightforward integrations independently. It handled setup, endpoints, validation, and tests based on the client’s established patterns.
Developers focused on reviewing and refining the final result instead of writing repetitive code.
5. Continuous Refinement
In daily development, the team observed AI behavior, gathered feedback, and updated the rule system. This ongoing improvement increased accuracy and allowed the AI to take on more work over time.
04. Solution
Teamvoy built an AI-assisted development workflow around Cursor to reduce repetitive tasks, bring structure to all integrations, and shorten delivery time.
Core Innovation: Cursor Rules Architecture
The team created a structured rules library inside .cursor/rules/. This became the central source of knowledge for the AI and included:
- Definitions for each integration category
- Authentication logic for OAuth, API keys, and custom tokens
- Data mapping patterns
- Error-handling and retry expectations
- Testing patterns with Jest and Nock
- Coding and naming standards
- Guidelines for repository structure and configuration
With this library, the AI could implement new integrations using consistent, predictable patterns.
Key Components of the Solution
Intelligent Development Context
Test coverage above 90% gave the AI a reliable reference for expected behavior. Combined with the rules library, the AI could follow both high-level logic and detailed instructions.
Integrated Workflow Automation
The development flow became tightly connected:
- Developers created or referenced tasks in Linear
- Cursor analyzed the task and the repository
- Cursor generated the necessary code and opened a pull request
- Cursor reviewed the pull request
- Developers provided final input and approval
This loop allowed the AI to take on most of the mechanical work.
Workflow Example:
- A developer tags Cursor in a Slack message with a task.
- Cursor analyzes repository context and rules.
- Cursor implements the task and opens a GitHub pull request.
- Cursor reviews its own pull request, adding comments.
- The developer reviews and requests adjustments if necessary.
- The PR is approved and merged.
06. Info
Services:
AI-Assisted Development Implementation
API Integration Architecture
Development Process Automation
Industry:
SaaS / IT Lifecycle Management
IT Asset Management
Integration Categories:
HRIS
MDM
Telecommunications
Distributors
Shipment
Asset Management
Portfolio Category:
AI, System Integration, AI Agent
Tech Stack:
Node.js
TypeScript
Express
Nx Monorepo
Testing:
Jest
Nock
07. FAQs
How do you build an AI agent?
To build an AI agent, you define the tasks it should handle, provide rules or examples, and connect it to the tools or APIs it needs. You then test it on real scenarios, refine the logic, and improve accuracy over time. This is exactly how Teamvoy shaped its Cursor-based agent.
What is an AI agent?
An AI agent is a system that can analyze instructions, understand context, and take action. It can write code, process data, or automate tasks. It behaves more intelligently than a simple script because it relies on contextual reasoning.
Can I create my own AI?
Yes. Modern tools let you create an AI agent without training a model from scratch. By defining rules, providing examples, and connecting APIs, you can build your own agent tailored to your processes.
How do you train your own AI?
Training depends on your goals. You can:
- Provide prompts and examples
- Build a structured knowledge base
- Connect tools for deeper functionality
- Or fully train a model using large datasets
Most teams reach strong results using rule-based and example-driven training, as Teamvoy did.

