Interview with Founder - Aiswarya Sankar
Building AI-Powered Engineering Tools
Building features in large, complex codebases requires deep understanding of system architecture and context. At Entelligence, we're revolutionizing how engineers interact with and understand their codebases through artificial Entelligence.
The Evolution of AI in Engineering:
The journey of AI in software development has evolved through several key phases:
1. The ChatGPT Phase
"As soon as ChatGPT came out, everyone was super excited that you could stick a chunk of code into GPT and get something else out. It was able to come up with something remotely helpful, but people quickly realized doing these things in isolation can make you a better Stack Overflow, but it's not going to actually help you build something in your system." - Entelligence Founder
2. The Autocomplete Era
"We saw the whole hype cycle around autocomplete solutions and Copilot. What that was able to do is now it's in your editor, able to fill in a little bit about predicting what you're trying to do, understand some things around the before and after for that code."
3. The Context-Aware Future
"What people started to realize is that in order to actually write real software engineering code, you need to understand a lot of things about how the system works, and none of these solutions right now are actually able to do that."
Key Technical Capabilities:
Entelligence's platform is built on two fundamental pillars that address the core challenges of modern software development:
1. Context Awareness
Our system goes beyond simple code understanding (a rough context-assembly sketch follows the list):
- Integration with Language Server Protocols (LSPs)
- Graphical representation of code structures
- Time series awareness of codebase changes
- Team member contribution tracking
- Integration with PR and Slack communications
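To make the list above concrete, here is a minimal sketch of what assembling such context might look like. This is an illustration only, not Entelligence's implementation; every name in it (ContextBundle, merge_context, and the example strings) is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ContextBundle:
    """Hypothetical container for the context sources listed above."""
    symbols: list = field(default_factory=list)         # e.g. from LSP queries
    recent_changes: list = field(default_factory=list)  # e.g. from git history
    related_prs: list = field(default_factory=list)     # e.g. from the PR tracker
    discussions: list = field(default_factory=list)     # e.g. from Slack threads

def merge_context(bundle: ContextBundle, max_items: int = 20) -> str:
    """Flatten a bundle into a prompt-ready context block, capped at max_items."""
    items = (bundle.recent_changes + bundle.related_prs
             + bundle.discussions + bundle.symbols)
    return "\n".join(items[:max_items])

# Illustrative data only.
bundle = ContextBundle(
    symbols=["def charge(amount): ..."],
    recent_changes=["refactor payments retry logic (merged last week)"],
    related_prs=["add idempotency keys to webhook handler"],
)
print(merge_context(bundle))
```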
2. Intelligent Planning
We help engineers make informed decisions about how to build and modify systems:
- Comprehensive onboarding guides
- Visual diagrams of code architecture
- Custom command support
- Multi-codebase integration
Security and Privacy
We understand the importance of protecting intellectual property and sensitive code. Our platform offers the following (a rough self-hosting sketch follows the list):
- Full self-hosting capability
- Containerized deployment via AWS Marketplace
- SOC 2 certification in progress
- Per-organization model tuning without cross-domain sharing
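As a rough sketch of what the self-hosted path could look like operationally, the snippet below launches a container with the Docker SDK for Python. The image name, port, and environment variables are placeholders; consult the actual marketplace listing for real values.

```python
import docker  # pip install docker

client = docker.from_env()

# All values below are placeholders, not Entelligence's published settings.
container = client.containers.run(
    image="example.registry/code-assistant:latest",  # hypothetical image
    detach=True,
    ports={"8080/tcp": 8080},                  # expose the web UI locally
    environment={
        "GIT_PROVIDER_TOKEN": "<read-only token for your repos>",
        "TELEMETRY_ENABLED": "false",          # keep everything on your servers
    },
)
print("started container", container.short_id)
```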
Technical Architecture
Our platform leverages multiple AI models for optimal performance (a rough routing sketch follows the list):
- Claude 3.5 Sonnet for core logic and reasoning
- Specialized models for specific tasks like issue solutions
- Custom-trained models for understanding organization-specific patterns
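One way to picture the multi-model setup is a thin router that maps a task type to a model. The sketch below is an assumption about the shape of such routing, not Entelligence's code; the model identifiers are illustrative.

```python
# Hypothetical task-to-model routing mirroring the list above.
ROUTES = {
    "reasoning":      "claude-3-5-sonnet",   # core logic and planning
    "issue_solution": "issue-solver",        # hypothetical specialized model
    "org_patterns":   "org-tuned-model",     # hypothetical per-org fine-tune
}

def pick_model(task_type: str) -> str:
    """Return the model for a task, defaulting to the core reasoning model."""
    return ROUTES.get(task_type, ROUTES["reasoning"])

print(pick_model("issue_solution"))  # -> issue-solver
```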
Current Status and Growth
Entelligence is rapidly expanding its capabilities and reach:
- Team of six based in San Francisco
- Recently completed seed funding round
- Supporting thousands of engineers
- Processing ~340 PRs daily for one customer
- Partnering with major open-source projects including Assistant UI
Integration Capabilities
Our platform seamlessly integrates with your existing workflow:
- VS Code extension for direct IDE integration
- Slack and Discord integration for team communication
- Support for multiple codebases and third-party tools
- Custom command creation and automation
Full Interview Transcript
Interviewer: So who are you?
Founder: Hey, I'm Aiswarya. I'm the founder of Entelligence.
Interviewer: And what is that?
Founder: At Entelligence, we're working on building artificial engineering Entelligence. So a little bit about me first and how this came about - before starting Entelligence, I was at Uber for about four or five years, and what I realized - at Uber we were using Copilot for a while - is that all these tools are trying to replicate what engineers do. If you look at some of the top-performing engineers, what they were really excelling at was domain knowledge. They're people who are domain experts in different areas, which means they have a comprehensive understanding of how previous things were built, what different teams are doing, what teammates are doing, actually reviewing PRs and having a good understanding of the entire architecture and system. That simply isn't the case with current AI developer tools. So I set out to build a system that could give all engineers that kind of context and help them build the way those engineers were building at these large companies.
Interviewer: AI engineering is a different ballgame than writing C++ code in the old world, right? This is a new world. Tell us a little bit about what you're seeing and what your company helps engineering teams with.
Founder: I think there have been a few phases of using AI in engineering. As soon as ChatGPT came out, everyone was super excited that you could stick a chunk of code into GPT and get something else out, or you could ask it to write a basic app. It was able to come up with something remotely helpful, and you could ask it, "hey, this is some error message, what can you do about it?" But with all of this, I think people quickly realized that doing these things in isolation can make you a better Stack Overflow, but it's not going to actually help you build something in your system.
That was kind of the first phase, and then we saw the whole hype cycle around autocomplete solutions and Copilot and everything. What that was able to do is, now it's in your editor, it's able to fill in a little bit by predicting what you're trying to do and understanding some things around the before and after for that code.
But what people also started to realize is that in order to actually write real software engineering code, you need to understand a lot of things about how the system works, and none of these solutions right now are actually able to do that. I think the next wave of these solutions is, as I said, about replicating how engineers actually think about these problems - understanding how the architecture works, knowing what the different components do, how different people are collaborating - and not only helping write the code but helping design the solutions, helping review them, and making sure the entire system works together. So that's the next phase of how AI can help engineers, and that's what we've been working on.
Interviewer: I was talking to Dylan this morning - he runs an R&D group at Unity - and he's saying that this new world requires a different architecture, different thinking about architecture. Do you agree with that at all? And how does your system help get people into building? Like we're having to learn new ways to do things like RAG, retrieval systems. Give us a little taste of what you think of as new architectures.
Founder: I would say there are really two large problems that pretty much any company building in the dev tool space has to learn and solve. One is this context awareness issue. There have been approaches with RAG, and there are obviously a lot of improvements you have to make on top of it to make these systems work on code. We've seen a lot of things - one is integrating LSPs, being able to actually understand the code structure; being able to do graphical representation if you're trying to draw diagrams; how exactly you do the different layers for these codebases.
So that entire thing works into this context awareness, and it takes a few steps beyond just code. Again, if you look at current solutions, a lot of them do try to take in the entire codebase for context, but they're missing very critical things like time series awareness. If I work on a codebase, go work on something else, come back - I want to know what all has changed and how that's going to affect my future decisions. Getting recaps on the code, not only that but understanding when you're working in a team - it's very important to know what other people are doing. So is your system actually aware of individuals, how they're contributing, what's changing?
All of those things are a part of this first part of the problem, which is context awareness - do you understand how the codebase, how the org is working? If someone mentioned something in a PR or in Slack, does your system actually take that into account when it's making decisions? That's what engineers at engineering orgs do today, but most AI systems have none of that information.
I think the second part of the problem is really planning - can you take that information that you now know about the system and use that to actually decide how to build things in a smart way? Both of these components are super critical to make systems that can actually work somewhat similar to how engineers are able to actually build things.
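The "time series awareness" described here - knowing what changed while you were working elsewhere - can be approximated locally with plain git history. A minimal sketch, assuming a local clone and the standard git CLI (not Entelligence's actual pipeline):

```python
import subprocess

def changes_since(path: str, since: str = "2.weeks") -> list:
    """List commits that touched `path` in the given window, newest first."""
    result = subprocess.run(
        ["git", "log", f"--since={since}", "--date=short",
         "--pretty=%h %ad %an  %s", "--", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.splitlines()

# Example: recap activity on a directory you haven't touched in a month.
for line in changes_since("src/payments/", since="1.month"):
    print(line)
```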
Interviewer: You're building like a pair programmer that would sit next to me, an AI that would interact with me right? Tell me what that pair programmer would do compared to a human pair programmer.
Founder: Yeah, so as I said, we really model this off of what I was seeing at these large companies. There's a few key tech leads that do this - they understand the entire system, they're the ones giving people feedback, mentorship, they're kind of the ones who review and sign off on different design decisions. And I was thinking how can we actually replicate something like that? Because if you think about it, they have a few key things that they're able to do.
We don't want just another human - we want an AI that actually has this full system awareness. There's a lot of simple low-hanging fruit there. If you think about it, a lot of times engineers that work at these large companies think about "I want to build this feature" - probably 10 people have done this before. Half of the problem there is just trying to find where those PRs are, who to reach out to, who to ask. That's one of the most common things that happens when you're trying to solve a new bug - finding who did it, asking someone, figuring out how to orient.
Right now, yes, a tech lead who's been there 6-8 years probably knows that information, but if you can build an AI system that's actually robust enough to understand the entire time series of how this codebase has changed, what are those similar issues in PRs, it can point that out to you. So that's one of the things that we support now - you can search for similar issues or PRs in our platform and it will help you kind of walk through "okay, this is something that's doing something similar to the ticket you're working on."
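The "find a similar PR" search described above is, at its core, a semantic-similarity lookup over past issue and PR text. A toy sketch with TF-IDF cosine similarity (the real system presumably uses richer embeddings; the example PRs are made up):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Made-up corpus of past PR titles/descriptions.
past_prs = [
    "Add retry logic to the payments webhook handler",
    "Fix race condition in session cache invalidation",
    "Migrate billing service to the new auth middleware",
]

def most_similar_pr(query: str) -> str:
    """Return the past PR whose text is closest to the query."""
    vectorizer = TfidfVectorizer().fit(past_prs + [query])
    scores = cosine_similarity(
        vectorizer.transform([query]), vectorizer.transform(past_prs)
    )[0]
    return past_prs[scores.argmax()]

print(most_similar_pr("webhook keeps timing out, needs retries"))
```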
Founder: Another piece, as mentioned - this isn't necessarily low-hanging fruit - is being able to generate visualizations or diagrams. I think in my first CS class at Berkeley they really highlighted that coding is a lot about abstractions. If you're trying to look at the system as a whole, you do not want to understand how function A is implemented. There are a lot of different abstraction layers, and when you're thinking about the system, you need to abstract away all the details you don't need to know so that you can focus on what's important and go back and forth between these layers of abstraction.
So that's really something we also need to get our AI systems to be able to do. We've thought a lot about this when we generate diagrams - how can we use LSPs, which give you all the granular details, like which function calls what, with which class name, under what, and actually zoom that out to a level humans can understand, the overall big picture, and then go in as needed.
So we don't want a pair programmer that will just walk you through "okay, this line of code doesn't seem right." We want something that can give you that overarching understanding: "these are the relevant things, this is how the system has been changing, if you want an overview this is where you should look." That's what we think engineers will want as they interact with AI in engineering.
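The zoom-out described in this answer - raw LSP-level detail collapsed into an architecture view - amounts to grouping a function-level call graph by module. A toy sketch (the input graph is hand-written here, standing in for what an LSP call-hierarchy query would return):

```python
from collections import defaultdict

# Function-level call edges as "module.function" strings; purely illustrative.
call_graph = {
    "billing.charge":   ["payments.capture", "billing.log_invoice"],
    "payments.capture": ["gateway.post"],
    "billing.refund":   ["payments.void"],
}

def collapse_to_modules(graph: dict) -> dict:
    """Collapse function-level edges into module-level edges for a diagram."""
    modules = defaultdict(set)
    for caller, callees in graph.items():
        caller_mod = caller.split(".")[0]
        for callee in callees:
            callee_mod = callee.split(".")[0]
            if callee_mod != caller_mod:      # hide intra-module detail
                modules[caller_mod].add(callee_mod)
    return dict(modules)

print(collapse_to_modules(call_graph))
# {'billing': {'payments'}, 'payments': {'gateway'}}
```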
Interviewer: It sounds like - I use Limitless on my computer to record all my meetings, and it really didn't become useful for a few weeks, because you had to get enough things in there before you started seeing the usefulness of the search features, talking to it, and so on. It sounds like the same thing is true here, but also your AI system is trying to really understand the codebase over time. So first of all, how long does it take to get set up for a team - let's say I have five engineers - and second, how long does that team really have to give your tool before it understands your codebase and really starts working?
Founder: That's a good question. As I said, most of the companies we work with are not starting from scratch. We actually go in and ingest everything historical. The great thing about GitHub, about code, is that you have pretty much an entire sequential, historical roadmap of how the codebase came to be. So we're able to take all of that information into account, so that if there was an issue in the past, we have access to it from day one.
So a lot of that knowledge - the understanding of how your system came to be - we can set up from the very beginning, and then as you keep developing we keep everything up to date. On top of everything that's in your code today, you can also feed it things like your style guide and how you prefer code to be written. All of those things factor in, so it's able to understand what you're looking for and your codebase's best practices.
This really factors into our PR reviews and issue solutions - understanding what you as a team are looking for, so it's able to guide you the way you and your org have already found works best.
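Ingesting that historical roadmap can start from the Git provider's API. Below is a minimal sketch using PyGithub to walk merged pull requests oldest-first; the token, repository name, and the fields collected are placeholders, and this only approximates the approach described, not the actual ingestion pipeline:

```python
from github import Github  # pip install PyGithub

gh = Github("<personal-access-token>")     # placeholder token
repo = gh.get_repo("your-org/your-repo")   # placeholder repository

# Build a simple historical index of merged PRs, oldest first.
history = []
for pr in repo.get_pulls(state="closed", sort="created", direction="asc"):
    if pr.merged_at is None:
        continue  # skip closed-but-unmerged PRs
    history.append({
        "number": pr.number,
        "title": pr.title,
        "merged_at": pr.merged_at.isoformat(),
        "files": [f.filename for f in pr.get_files()],
    })

print(f"indexed {len(history)} merged PRs")
```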
Interviewer: Will this comment your code as well? Because you know commenting is always a contention with developers - some people don't like to do it, some people really like to do it but they have their own style, and if you're working on a team you really need to stick with a standard way of commenting your code because somebody's going to have to use that code later.
Founder: Yeah, so we started off by first building those two core APIs that I mentioned - context awareness and planning - and over the last month or two we've really been plugging that into all the interfaces engineers need the most.
It started off with the website: if you're trying to onboard to the codebase we give you an entire onboarding guide, you can understand issues, and we comment directly on your PRs. But recently we've also been expanding our VS Code interface. Within that you have all the functionality of highlighting code to explain it, adding comments, visualizing just a section of the code - we have very robust commands now.
As I said, everything from searching for an issue to searching for a relevant PR is set up as different commands on our platform. So yeah, pretty much within your codebase you can ask it to do those things, and you can also build custom commands - if you want something that goes in and comments every function, you can set up a custom command to go ahead and do that for you.
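A custom command like "comment every function" reduces to walking the syntax tree, flagging undocumented definitions, and then generating text for each one. The sketch below covers only the detection half, using Python's ast module; it is an illustration of the idea, not the Entelligence command itself:

```python
import ast

def undocumented_functions(source: str) -> list:
    """Return names of functions in `source` that lack a docstring."""
    missing = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if ast.get_docstring(node) is None:
                missing.append(node.name)
    return missing

example = '''
def charge(amount):
    return amount * 1.2

def refund(amount):
    """Issue a refund."""
    return -amount
'''
print(undocumented_functions(example))  # ['charge']
```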
Interviewer: Sort of a similar question but a little bit deeper - what about testing and making sure that the system doesn't regress or anything like that? Do you write test cases for us and hook into a CI/CD? Do you work with other test suites? Because I know some teams have their own test suites that they hook into and stuff like that.
Founder: Yeah, yeah of course. How about I actually show you?
[Demo section follows with VS Code walkthrough]
Interviewer: Wow, all right. So, a lot of enterprises are really nervous about intellectual property, and obviously you're giving this direct access to your intellectual property. Tell us a little bit about how you think about privacy. Do you use this to train another model in the future? Are you listening, are you watching my screens - you know, all the kinds of conspiracy theories people have: "oh my God, I can't do this stuff, not at work."
Founder: As I said, security is super important in this space. As you mentioned, as soon as you mention code, that's one of the first things that comes up. So we actually built our entire system from the very beginning to be fully self-hostable. Essentially, we have an app on AWS Marketplace, and soon on other marketplaces, so the entire thing is containerized and you can run it on your own servers.
If you're working at a larger company, I think that's usually what works best. In addition, we're also in the process of getting our SOC 2. As for how we look at security around fine-tuning models - those are all only per org. You get full access to the model. We do tune on your previous issues and PRs in order to do the issue solutions, because that is very customized per codebase, per org - to become a domain expert you really need to fully understand that org's setup - but there's absolutely zero sharing cross-org or cross-domain, and if you're self-hosting, we never actually see those models.
Interviewer: Tell me a little bit about your business. Who are you working with, how many employees, how are you funded, the usual stuff about business?
Founder: Right now we're a team of six based in San Francisco. Pretty small team - we just raised our seed actually so we will be disclosing our primary investors pretty soon. And yeah, we've been working with two design partners and we have several thousand engineers using the platform both on VS Code and our UI.
We actually recently kicked off a benchmark hackathon - that's where this agent list came out of - just helping engineers trying to understand all these different codebases. And with that we also put out the first version of those two APIs I mentioned, in case engineers want to continue to build on top of them.
Right now, as I said, our first design partners have really been focused on the PR reviews - one of our customers does about 340 PRs a day using our system. VS Code is pretty new, but we've been adding all this new functionality and are looking to start promoting it more very soon.
Interviewer: Where do we learn more about you?
Founder: You can check us out at our website Entelligence.ai. I can drop links for our Discord.
Interviewer: Thank you so much.
Founder: Thank you.