Machine learning at the edge is gaining steam. BrainChip is accelerating this with its Akida architecture, which mimics the human brain by addressing the five human senses on a machine-learning-enabled chip.
Its chips let roboticists and IoT developers run ML on device, enabling low-latency, low-power, and low-cost machine-learning-enabled products. This opens up a new product category where everyday devices can affordably become smart devices.
Rob Telson
Rob is an AI thought leader and Vice President of Worldwide Sales at BrainChip, a global tech company that has developed artificial intelligence that learns like a brain, whilst prioritizing efficiency, ultra-low power consumption, and continuous learning. Rob has over 20 years of sales expertise in licensing intellectual property and selling EDA technology, and he attended Harvard Business School.
Links
——————–transcript——————-
Abate: Hello, welcome to the Robohub podcast. This is your host Abate, founder of Fluid Dev, a platform that helps robotics and machine learning companies scale their teams as they grow. I'm here today with Rob Telson,
the VP of Worldwide Sales at BrainChip. Welcome, Rob, it's an honor to have you on here.
Rob: Abate, it's great to be here, and thank you for having me on your podcast.
Abate: Awesome. Could you tell us a little bit about what you guys are doing at BrainChip and what your role is over there?
Rob: Absolutely. So, you know, at BrainChip, the way we're approaching the world is we are revolutionizing AI for edge-based devices and the world of IoT moving forward. We've developed a processor based on what we call a neuromorphic architecture, and the whole function is to basically mimic the brain. By mimicking the brain and the way we function,
we're going to consume about five to ten times less power and energy in processing information compared to how traditional AI processors work today. So when we think about IoT devices or edge-based devices, we're talking about anything from wearables to the future, which is electric vehicles, flying taxis, or anything we can't even fathom at this point in the world.
But these are devices and applications that require very low power consumption, and by implementing intelligence into these devices and applications, that's where BrainChip comes into play. What makes us extremely unique and differentiates us is that we've developed our product not only to function like the human brain, but also to focus on what we call five sensor modalities.
And those sensor modalities are vision, hearing or speech, taste, smell, and vibration. In the world of AI today, most solutions or applications that process information for artificial intelligence are focused on vision, meaning object detection or image recognition, and on hearing and speech.
So by approaching this from the five sensor modalities, we're starting to introduce new functions and new ways that allow you to really address a lot of different dynamics out there when it comes to the devices we use today as consumers in our everyday life. My job at BrainChip is I'm responsible for worldwide sales.
And so it's really about communicating the story. It's really about getting companies to adopt our technology and addressing it from that end. So these are very exciting times for BrainChip.
Abate: Awesome. Yeah. And for robotics it's very much the same thing. Vision has been a very large portion of a lot of the development that's been done, and people are very visual, so it makes a lot of intuitive sense to deal with visual data. Can you dig into the reasoning behind deciding to go after some of these other senses, and what are some of the benefits this can bring longer term for, say, robotics or other ML applications?
Rob: Yeah. Just to address that real quickly, we look at robotics as a key area in which our technology is going to be extremely impactful as robotics evolves over time. But really what drives our ability to address the five senses is the architecture of the technology. Our product is called Akida, and the way Akida processes information allows you to focus on
these other senses that traditional AI architectures might struggle with. The reason we say that is because traditional AI has to take all the information it gets and process it all at the same speed, performance, and power consumption. What I mean by that is: right now you're looking at me, so you're using your vision.
Your hands are probably resting on your desk or something to that extent, so you can feel or touch. You might have some coffee brewing in the background, so you can smell, which is another sense. But really, what you're focused on right now is listening to every word that I'm saying, and your brain is processing all of this
at the same time, while consuming most of its energy on the listening. That's different than how an AI processor works, but a neuromorphic architecture functions the same way: it spikes. It understands which events it actually needs to focus on, and right now the event of listening is where it wants to put all of its energy, as opposed to, say, smell.
Rob: So that's why we're able to address the five senses, and that's why we're able to really look at the evolution of AI. Let's just talk about it from a robotics standpoint. Now you can have vibration detection, for machinery purposes or other aspects from that end, or a robot that can recognize not only vibration but smell,
for gas leaks or other applications. So there's a lot of functionality that can take place. In most of the applications where you use AI today, you might have to put down multiple chips to cover the different functions you would incorporate into your solution. With BrainChip's Akida,
because we're spiking and we're focusing on events, we can focus on different modalities all on one device. So when we look at building out systems and building out technology, it puts us in a revolutionary position, because not only are we more efficient on power consumption and performance, the amount of area, or footprint, that we take up on a device is much less.
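As a toy illustration of the event-driven idea Rob describes (purely conceptual, and not a description of how Akida actually implements spiking), the savings come from spending compute only on the modality that is currently active, rather than running every stream at a fixed rate. A minimal sketch, with the threshold and function names as illustrative assumptions:

```python
# Toy illustration of event-based processing: spend compute only where there
# is activity. Conceptual only; not Akida's internal spiking mechanism.
import numpy as np

THRESHOLD = 0.2  # assumed activity level below which a stream is ignored

def process(stream_name, frame):
    # Stand-in for an expensive neural-network inference.
    print(f"running inference on {stream_name}")
    return frame.mean()

def event_driven_step(streams):
    """Only run inference on streams whose input changed meaningfully."""
    results = {}
    for name, (previous, current) in streams.items():
        activity = np.abs(current - previous).mean()
        if activity > THRESHOLD:   # a "spike": this modality is active
            results[name] = process(name, current)
        # quiet streams (e.g. smell while you are listening) cost almost nothing
    return results

# Example: audio is changing, the smell sensor is not, so only audio is processed.
streams = {
    "audio": (np.zeros(1000), np.random.rand(1000)),
    "smell": (np.ones(8) * 0.5, np.ones(8) * 0.5),
}
print(event_driven_step(streams))
```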
Abate: Yeah. And the power efficiency, especially for something like robotics, is really critical. So what is the power consumption of the SoC, how does it compare to competitors, and what are some of the things this unlocks, maybe for robotics, though it sounds like it's also very large for IoT?
Rob: Yeah, good question. When it comes to power consumption, we're processing these functions in microwatts to milliwatts. So for example, if you go to our YouTube channel, BrainChip Inc, you can see some of the demonstrations we've put in place. One of them we call the smart cabin of the vehicle of the future.
In that demo you have someone sitting in the driver's seat, and you have Akida recognizing who that individual is by name and saying, oh, I know who that is, that's Rob, he's sitting in the driver's seat. And then I speak and say, hey Akida, and it recognizes my voice.
It also recognizes simply that someone is in the vehicle. It's designed to demonstrate that if four people were in the vehicle, it could recognize all the voices, recognize all the names, and recognize who they are. And then behind the scenes, with some of the other applications you'd put in place, you could have all the preferences for each of those passengers within the vehicle. But what makes us different,
what makes it so exciting, is that the amount of power we're consuming to recognize someone's face is 22 milliwatts. The amount of power to recognize that someone is in the vehicle is 6 milliwatts. The amount of power consumed to recognize the voice is less than 100 milliwatts. So now you take all of that, and in a competitive environment, one of the main technologies implemented today would be in the tens of watts.
Abate: And just to give a picture of what that number of milliwatts means, what would that compare to? Like, how long would your iPhone charger need to be plugged into the wall to deliver that much energy?
Rob: That's a great question, and unfortunately I don't get into that much detail, so I'm not the right guy to answer it. I'd probably screw it up and I'd have some of my guys behind me saying, what were you thinking? So I kind of stay out of it when we get into
the depth of the technology. But the purpose of the demonstration is really to highlight that the technologies being implemented to do this today are consuming tens to hundreds of times more power than what we're capable of doing with our technology. And that's what gets very, very exciting.
So we've seen some announcements over the last month in which a specific company has taken our technology, validated it within their vehicle for keyword spotting purposes, and compared it to their current solution.
They're seeing five to ten times less power being consumed. And that's very exciting as we move forward toward goals like vehicles that can go a thousand miles on a charge, or phones that can last three to five days on a charge. Those are the types of things where you're going to see new technologies, such as what we've designed with Akida, start to change the way devices are architected, which will allow us a lot more freedom and flexibility, from wearables all the way through to new devices that will be introduced.
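For a rough sense of scale, here is a back-of-the-envelope sketch answering the iPhone comparison above. The battery capacity and the "tens of watts" figure are illustrative assumptions, not numbers from the conversation; only the 22 milliwatt figure is Rob's:

```python
# Back-of-the-envelope comparison of the power figures quoted above.
# Assumption: a phone-sized battery of roughly 11 Wh (about 3,000 mAh at 3.7 V).
battery_wh = 11.0          # assumed battery capacity in watt-hours
akida_face_w = 0.022       # 22 mW face-recognition figure quoted above
conventional_w = 10.0      # assumed "tens of watts" conventional accelerator

hours_on_akida = battery_wh / akida_face_w
hours_on_conventional = battery_wh / conventional_w

print(f"~{hours_on_akida:.0f} h (~{hours_on_akida / 24:.0f} days) at 22 mW")
print(f"~{hours_on_conventional:.1f} h at 10 W")
# Roughly 500 hours versus roughly 1 hour on the same battery.
```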
Abate: Yeah. And some of the other products that will be unlocked by using such a low amount of power include the ability to take sensors or small computers, send them out on single-use batteries, and leave them out there for one or two years, the way we've seen with GPS trackers and what that has unlocked.
So yeah, there are definitely a lot of really good applications this enables for robotics and ML, and this is going to push the shift from doing a lot of processing in the cloud down to the edge. When you've been building your ML algorithms for the cloud, and now you're thinking about how to do it for this ultra-low-power
device, what changes for the developer who's making the algorithms? What are the limitations they're going to have now that they're working on a very low power device, but it's here and it's local?
Rob: Yeah, great question. What I want to highlight is that traditional AI today, just as you brought up, is processed in the cloud or at the data center level, however you look at it. The AI architectures of today are really beefy; I use the word beast.
They consume a lot of energy, they consume a lot of power, they process at very high performance, and they don't have any constraints. In the world of technology, not having any constraints gives you a lot of freedom to flex your muscle. But when we talk about moving away from the cloud and being able to process on the device, the end goal is not just to process on the device; it's to process on the device without having to depend on the cloud by sending information back and forth.
What I mean by that is, let's take a home assistant or a voice assistant on our phone today. For anyone out there who has ever said, hey phone, and let's use the word Siri, hey Siri, and Siri responds with, I can't help you right now, or I'm unavailable right now: it's struggling to communicate off device to the cloud
and back to you. Now in a normal world, you say, hey Siri, I want to go to the closest restaurant, and immediately it says there is a hamburger place half a mile from here, do you want directions? You hit the button, yes, and you move on with life. As a user, we're not impacted by that. But in the other avenue I just talked about, where it's unavailable,
we are impacted. Now I want to magnify that, and I'm going to get to the end goal of the question in a second, but I want to magnify it as we start thinking about our dependency on these devices to help us with directions, help us solve a problem, help us in a variety of different ways, or entertain us with music and video.
All of that right now goes off device to the cloud. Now let's think about the electric vehicle in a critical situation. The vehicle has to make a decision, but it can't, because it can't get access to the cloud. Those are the things that concerned us.
So when we designed Akida, we developed it so it can process on the device without having to go to the cloud. What that enables does give you a lot of that freedom and flexibility and a ton of functionality, but the other thing it does is provide a level of privacy and security.
So when we're in critical scenarios and we're processing information, it's not going to the cloud, or it goes to the cloud in batches at the end of the day in a secure environment. It also lets you look at devices from these wearables all the way to the vehicle, and I use the vehicle only because we can conceptualize with it.
It allows you to start making critical decisions and getting an immediate response, and again, doing it with five to ten times less power. The other thing that gets very exciting about what we're doing with Akida is the on-chip learning, or what I call device personalization. In the world of machine learning, as you know, you have to develop these networks.
Let's just use TensorFlow, for example. You traditionally develop your convolutional neural network in TensorFlow, you validate it, and so on, and that process could take six months, nine months, a year, depending on how complex the network is. What we're doing with Akida is edge-based learning, or on-device learning.
So I can capture your image, your voice, and other aspects of who you are without having to design you into the network.
Rob: Okay. And so not only that, and again I'm using the in-cabin of a vehicle because we can conceptualize it: I want to add three passengers or drivers to this vehicle by voice, by image, and by other aspects.
Again, Akida learns them on the fly without having to develop a new network. Now take that one step further. You were talking about robotics, so let's talk about robotics on the shop floor and their ability to sense things. Say we want a robot to sense a gas, and we've trained it with a network
for smell, but I want to add a new smell to it. I want to add smoke, which isn't in the network. We could teach it smoke without having to go through this whole machine learning process of redeveloping the network. The same with vibration and taste. That's where it gets really exciting, and we hit these endless opportunities of introducing intelligence in areas
we didn't think we could for a while.
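To make the on-device learning idea concrete, here is a minimal conceptual sketch of adding a new class, such as "smoke", from a handful of samples without retraining the backbone network. The class names, the nearest-prototype scheme, and every function here are illustrative assumptions; this is not BrainChip's MetaTF or Akida API:

```python
# Conceptual sketch only: adding a new class on device without retraining the
# backbone network. NOT BrainChip's actual API; names and the nearest-prototype
# scheme are illustrative assumptions.
import numpy as np

class OnDeviceClassifier:
    def __init__(self, feature_extractor):
        # A frozen, pre-trained network mapping raw sensor data to embeddings.
        self.extract = feature_extractor
        self.prototypes = {}          # class name -> mean embedding

    def learn_class(self, name, samples):
        """Add a new class from a handful of examples, with no retraining."""
        feats = np.stack([self.extract(s) for s in samples])
        self.prototypes[name] = feats.mean(axis=0)

    def predict(self, sample):
        feat = self.extract(sample)
        # Nearest prototype wins.
        return min(self.prototypes,
                   key=lambda n: np.linalg.norm(feat - self.prototypes[n]))

# Usage: a gas-sensing model trained for "gas leak" learns "smoke" on the fly.
# clf = OnDeviceClassifier(feature_extractor=my_frozen_model)   # hypothetical
# clf.learn_class("smoke", smoke_samples)   # a few labelled readings
# print(clf.predict(new_reading))
```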
Abate: Yeah. And not to mention, when you're doing all of these things locally, you're no longer uploading a lot of data to the cloud and back down, so it doesn't become a big data hog. And the people who are able to train this, say on the shop floor, are not engineers anymore.
These are regular users.
Rob: Yeah. It's funny, because again, going to the YouTube channel where we have all of our content, or going to our website at www.brainchip.com, where you can access all the content as well, you get this feeling that it's simple to do this type of training. So simple that I can do it.
I'm not the sharpest tool in the shed, and I'm not a machine learning expert, but that's the intention. The exciting thing for us is we've just launched our development systems and our PCIe boards in a very small form factor. So users of all levels, those who are curious or actually have an application where they're trying to solve something and introduce AI, can get access to our Raspberry Pi development systems.
They can get access to a Shuttle PC development system, or they can get access to our PCIe board and plug it into their own environment. The whole intention was plug and play. As we sat in the room architecting how we were going to go about doing this, I said, okay guys, at some point we have to have a product that I can use, and if I can plug it in, turn it on, and start playing with it,
then I know we're going to be successful.
Abate: Just to step back a bit: you guys are producing these systems on chip, you're producing some development boards as well, and the technology that you're making, you're also licensing out to companies so they can design it directly into their systems.
Rob: Yeah. So, you know, we have a variety of business models, but the key focus of the company is really about enabling as many users and future users as possible to get access to Akida and its environment. At the end of the day, though, when you look at companies that are designing technology, most are developing their own systems on a chip.
And in order to develop a system on a chip, you have to have technology you can integrate into it. So we've started with, hey, let's license our Akida processor as IP, or intellectual property; you can design it into your SoC, and it can be configured in a variety of different ways.
Then on top of that we have our development boards and our development systems, so users of all levels can get access to the technology and start working with it. And then we have our chips available, so those who did not want to design in the IP and do an SoC could get access to our chips, put them in their own environment,
put them on a board, and start working with them. All of this is predicated on having a very simple development environment to work with. So we have our own development environment called MetaTF, where meta means post and TF means TensorFlow. You can go to www.brainchip.com/developer, log into MetaTF,
start using our environment, run through a ton of examples that we have, or integrate your own network and actually optimize it, so you would know how your network would work within the Akida environment: what type of power consumption, what type of performance, all the aspects of going through a traditional simulated environment.
So to me, and just like you said, those that know understand this and go, wow. Because what you can do is, at no cost, work in the Akida environment through MetaTF, get all your work done, and then decide, okay, how do I want to implement this?
Rob: Do I want to go on chip? Do I want to put it into an SoC? And you get it all done without having to make a major investment
in the technology. So it's all there. We launched MetaTF back in April of 2021, and between April and December 31st, 2021, we had over 4,600 unique users start taking a look at MetaTF and start playing with it. To us, that's what was really exciting, because as more users
start going there, start getting interested in it, and start learning how to use it, we look at that as just the first phase of the proliferation of our technology.
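The workflow Rob describes maps onto a familiar TensorFlow pattern: develop and train a Keras CNN as usual, then quantize and convert it for the Akida runtime and evaluate it in simulation. The sketch below is a hedged illustration: the Keras portion is standard, but the cnn2snn calls shown in comments are assumptions about MetaTF's interface and should be checked against the documentation at brainchip.com/developer.

```python
# Illustrative sketch of the workflow described above: standard TensorFlow/Keras
# development first, then conversion for the Akida runtime via MetaTF.
# NOTE: the cnn2snn calls below are assumptions about MetaTF's interface,
# not verified against the current release.
import tensorflow as tf

# 1. Build and train a conventional CNN in TensorFlow, exactly as you would
#    for a cloud deployment.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(x_train, y_train, epochs=5)

# 2. Hand the trained model to MetaTF to quantize and convert it, then run it
#    in the simulated Akida environment to gauge accuracy, performance, and
#    power before committing to silicon. (Hypothetical usage; check the docs.)
# from cnn2snn import quantize, convert
# quantized = quantize(model, weight_quantization=4, activ_quantization=4)
# akida_model = convert(quantized)
# akida_model.evaluate(x_test, y_test)
```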
Abate: So I've actually gone through this process pretty recently, where my team at Fluid Dev was helping a customer decide what chipset they wanted to use for their ML-enabled product. And the thing you see when you get to that stage is that there are so many options out there.
They all have their slight differences, and you have to dig through datasheets and try to figure out, why this one, why that one? And of course, when you're designing a product, the first thing you want is to be sure you have the best thing long term for your product.
So you end up making Google Sheets with different attributes and comparing all of them. With that in mind, how would you advise people to, one, find out what is the right chipset for them? Even before you go through something like MetaTF, which is a really great system where you can sample before you buy, how do you pick what is the best chipset?
How do you pick what is good enough and make these comparisons?
Rob: That's a really good question. My gut says, as I'm going through this on the selling side, that it's really about education. And I think you've probably experienced this as well: when it comes to implementing AI right now, we're at the forefront of learning what technologies to use, what machine learning platforms to develop our networks on, and so on.
And there are a lot of different ways to go about doing this. What I try to talk to my customers about is that what you're doing today is not where you're going to want to be tomorrow.
Rob: And so you really need to look at the architectures that aren't just solving what you're trying to achieve today, but have the flexibility to get you where you want to go tomorrow.
Because when we're talking about technology, we're talking about moving at very rapid rates. If you're designing an SoC, for example, there's a good chance you're not going to go to production with that product for a year to two years. So now you go to production and you've spent all this time with some AI engine or processor:
does it meet your requirements for the next two years after that, or do you have to reevaluate, reanalyze, and so on? So really understanding the roadmaps of where the technologies are going is really important, and then understanding the platform you're designing your networks on, or how you're integrating that.
That's what I'd be looking at. And what I've experienced is that although most companies today would say, look, as you said with robotics, it's about vision, as we're talking I'm sure you're saying to yourself, wow, there's so much you can do outside of vision. If you could take Akida, for example, and integrate it for voice and vision,
Rob: just think, the third generation would be voice, vision,
and vibration, the three Vs. And then what about smell? So I look at it and say, it's really important to address what you need to address today, but where are you going and how are you going to get there? If you start having that discussion, you start looking at the roadmaps and the architectures of the technology.
That's when you start to see there are some very powerful solutions out there, not just BrainChip, that can take you in a variety of different directions.
Abate: I think you touch on something that everybody who's designing a product feels: this idea of, is this still going to be good enough in two years with technology moving? Are we going to be held back by this, or are we going to be able to swap what we're using today
out with the next generation? And maybe, well, what is your roadmap? You guys have a couple of products out right now with Akida. What happens in two years?
Rob: Yeah, we have a very robust roadmap, and again it gets really technical, but the way I like to look at it is, we've started at a point and we're going to go up and to the right with products that are extremely powerful and can handle a lot of complex computing.
And at the same point, we have a product that will go low end and be much more addressable to high-volume, low-cost environments. So that's our goal, and we've started to tie all this together. It really is built off of our current generation of Akida, and then looking at next generations, going both up and to the right and more flexible to the left.
Abate: And in your background you have this robot that I'm able to see. Could you talk a little bit about that and what we're seeing there?
Rob: Yeah. What I have in my background is a couple of boxes. The base box down below, which says Akida on it, is the box we use to ship our Raspberry Pi development systems and our Shuttle PC development systems, and the box above that is a smaller form factor.
That one has our robot, Ken. Robot Ken kind of references and highlights the different sensor modalities that we address, and he's become a little icon and picked up a little momentum of his own. Robot Ken's box is meant for the PCIe boards that we're shipping and selling as well.
So that's the image: BrainChip is really the Akida logo and robot Ken. And that kind of gives you a little color on who we are on the fun side.
Abate: Awesome. Awesome. Rob, thank you so much for talking with us today. It’s been very informative.
Rob: Yeah, I really appreciate your time, appreciate your questions, and how you're approaching things. I will say, as you go through your evaluations with your company, please do consider BrainChip and MetaTF; again, you get there by going to brainchip.com/developer, and any questions you have, we're here for you and any of your listeners.
Always feel free to reach out to us. We'd love to have a conversation with you.
——————–transcript——————-
tags: Algorithm AI-Cognition, Business, cx-Business-Finance, cx-Consumer-Household, cx-Industrial-Automation, Person, podcast
Abate De Mey, Robotics and Go-To-Market Expert