
Anti-platform

At kis.ai, we understand that the market is saturated with platforms, and our customers don’t need to be forced onto yet another one. We believe that good technology should be magically useful while remaining unobtrusive and imposing no constraints. This philosophy is embodied in our “anti-platform” approach.

Each block within kis.ai is designed to function independently. Customers can use these blocks without adopting the entire platform, allowing for flexibility and ease of integration into existing systems. However, when used together, these blocks seamlessly recognize and integrate with each other, like old friends.

This interoperability ensures that while our technology remains unobtrusive, it provides powerful, cohesive solutions when needed. The anti-platform approach ensures that our technology is both adaptable and powerful, fitting smoothly into our customers’ workflows without imposing additional complexity. By prioritizing utility and ease of integration, kis.ai’s blocks enhance productivity and innovation without the burden of a traditional platform’s constraints.

Incremental Adoption

Let’s see how two of our customers have adopted kis.ai incrementally.

Need Help with Backend APIs

A recent kis.ai customer was struggling to build new backend APIs quickly. They had also recently started building native mobile applications, and the mobile developers were frustrated: every API change took a minimum of two days and sometimes more than a week, with follow-up change requests delaying it further.

When we demoed the platform, their Engineering Manager zeroed in on our Data API and BFF blocks. They quickly generated Entity YAMLs for their existing database, created a product from these YAMLs using only the two blocks, Data API and BFF, and deployed it in their environment with access to their development database.
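The exact Entity YAML schema is specific to kis.ai and is not documented here; as a rough illustration only, with all field names hypothetical, a generated entity definition might look like:

```yaml
# Hypothetical Entity YAML, for illustration only.
# The real kis.ai schema and key names may differ.
entity: Customer
table: customers
fields:
  - name: id
    type: uuid
    primary_key: true
  - name: email
    type: string
    unique: true
  - name: created_at
    type: timestamp
expose:
  rest: true
  graphql: true
```

One file per entity like this is enough for the Data API block to serve CRUD endpoints over the existing tables.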

These services generated REST and GraphQL APIs for all Entity YAMLs without any code, fully secured with their existing IAM solution. Mobile developers used the endpoints and started building out their solution 4x faster, as they no longer had to request new APIs. Whenever they needed multiple APIs stitched together into a single response to improve frontend performance, they leveraged the BFF service without needing any backend-developer time.
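The BFF pattern described above boils down to fanning out to several generated endpoints and merging the results into one frontend-friendly response. A minimal sketch of that aggregation step, with hypothetical endpoint names and payloads stubbed in place of real HTTP calls:

```python
# Sketch of BFF-style aggregation: combine several backend responses
# into one payload for the mobile client. Endpoint names and payload
# shapes are hypothetical; in practice these would be HTTP calls to
# the generated Data API endpoints.

def fetch_order(order_id):
    # Stub standing in for GET /orders/{id}
    return {"id": order_id, "customer_id": "c-1", "total": 42.0}

def fetch_customer(customer_id):
    # Stub standing in for GET /customers/{id}
    return {"id": customer_id, "name": "Ada"}

def order_summary(order_id):
    """Stitch two backend responses into a single response for the UI."""
    order = fetch_order(order_id)
    customer = fetch_customer(order["customer_id"])
    return {
        "order_id": order["id"],
        "total": order["total"],
        "customer_name": customer["name"],
    }
```

The payoff is one round trip from the mobile client instead of two, without writing a bespoke backend endpoint for each screen.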

Control AI Experimentation and Costs

Another kis.ai customer was experimenting with OpenAI APIs and also wanted to get started with local open-source models such as Llama 3 or Mistral in the future. The immediate problem was that multiple teams were using OpenAI’s APIs in different projects, and it was becoming tough to manage their usage and keep the spend predictable. The IT team was also unhappy about sharing API keys directly with the development teams.

In this situation, the customer was excited about our AI Gateway block and started using just that. Their IT team created specific endpoints for each dev team with specific rate limits. They did not have to share the OpenAI API keys with the teams; instead, developers authenticated with their corporate SSO. The prompt engineering, A/B testing, and observability features of the AI Gateway block helped the dev teams improve productivity and experiment faster, while keeping costs in control.
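Per-team rate limiting of this kind is commonly implemented with a token bucket at the gateway. kis.ai’s actual mechanism is not documented here; the following is only a sketch of the general idea, with hypothetical team names and limits:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, one bucket per team endpoint."""

    def __init__(self, rate_per_sec, capacity):
        self.rate = rate_per_sec   # tokens refilled per second
        self.capacity = capacity   # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical per-team limits, e.g. team-a: 5 req/s with bursts of 10.
limits = {"team-a": TokenBucket(5, 10), "team-b": TokenBucket(1, 2)}

def handle_request(team):
    """Gateway decision: forward the call upstream or reject it."""
    if limits[team].allow():
        return "forwarded upstream"
    return "429 Too Many Requests"
```

Because the bucket lives at the gateway, each team sees only its own SSO-authenticated endpoint, and the provider API key never leaves the gateway.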