
Unlocking AI Potential: Streamlining Database Access with Teleport

Captivated by the capabilities of OpenAI’s ChatGPT, many of our data research friends are experimenting with tools and datasets to learn how Artificial Intelligence (AI) and Generative Pre-trained Transformers (GPT) can be used to solve unique and challenging business problems.

While many sample corpora datasets exist, experimenting on your production datasets is often needed but difficult due to access restrictions, challenging network configurations, or complicated approval processes.

In this session, we explore how implementing Teleport can streamline database access workflows for Data Research, Business Intelligence, and Data Engineering teams. We’ll focus on Teleport’s Database Access component and how our Access Requests feature can be leveraged to achieve the principle of least privilege through just-in-time access. We’ll show you how easily this integrates with existing tools like Jupyter Notebook, Keras, and TensorFlow while having some fun with a sample GPT model that generates text from movie quotes!

In this episode, we will cover:

  • How Teleport Works
  • Database Certificate-Based Authentication
  • Database Role-Based Access Control
  • Access Requests for Role Elevation
  • Developer Friendly Tooling (Jupyter Notebook, Keras, TensorFlow)

Key topics on Streamlining Database Access with Teleport

  • Teleport works with everything you have and provides real zero-trust access and audit for your entire stack.
  • By deploying Teleport, you get a single place to manage all privileges for all layers of the stack, for humans and machines.
  • Role-based access control (RBAC) allows administrators to set up granular access policies for databases connected to Teleport.
  • Teleport lets you specify rules about who can request and/or approve access.
  • Teleport is compatible with developer-friendly tooling such as Jupyter Notebook and Keras/TensorFlow.

Expanding your knowledge on Streamlining Database Access with Teleport

- Teleport Database Access

Learn more about Streamlining Database Access with Teleport

- Teleport Labs
- Contribute on GitHub
- Join our Slack community
- Participate in our discussions
- Why Teleport
- Get started with Teleport
- Teleport Resource Center
- Teleport integrations
- Teleport documentation

Transcript - Unlocking AI Potential: Streamlining Database Access with Teleport


Gus: 00:01:37.296 Okay. It's about time to get going. So hey everybody. Thank you for coming to the webinar today. Title is Unlocking AI Potential: Streamlining Database Access with Teleport. You can probably see my name is Gus Luxton. I'm a specialist solutions engineer here at Teleport. I spend my time talking to prospects and customers of Teleport and thinking about ways that we can help them solve problems and ways that Teleport can work for them in their environments. Today, I'm going to run through a whole bunch of different information about how Teleport can help you access databases in reliable ways and provide you with more information. So I will continue on for now.

Gus: 00:02:17.072 There is just some housekeeping stuff. If you have any questions, please pop them in the chat. Kat and Riley here will be moderating the chat and moving questions around and that sort of thing. There will be some time at the end for that. This presentation is probably going to go for about 20 minutes or so. There's going to be a live demo as well. So don't worry — it's not all just me pointing at slides and looking at things. There's going to be some actual interactive content too. And then we will cover the questions at the end. And if there's not quite time for all of them, don't worry. We always know who you are. We've got your details so we can get in contact with you and make sure you get the answers. And the video and the slide deck will be shared after the presentation. So yeah. With that, let's get going.

Gus: 00:02:57.339 So today's agenda. Oh boy, I haven't done this in presenter mode. Okay. So today's agenda. We're going to talk a little bit about how Teleport works, about certificate-based authentication for databases, and why Teleport is different to other services and other traditional PAM solutions in this regard. We're going to talk a little bit about role-based access control for databases, RBAC, and how Teleport can help you enforce permissions on databases. We're going to talk about access requests for role elevation. So you can give people no standing access or a very low level of standing access, and then have them request access to databases on an ad hoc basis. Better for compliance and security and making sure that people don't have access they're not needing to use. We're also going to talk about how Teleport is compatible with developer-friendly tooling. And so we have a graphical client, Teleport Connect. We're going to detail how that can be used to connect services up like Jupyter Notebook, maybe a bit of TensorFlow, and Keras as well. And there will be a live demo of me doing that as well and kind of showing you what that flow looks like in real-time and how simple it is to go ahead with.

How does Teleport work?

Gus: 00:04:00.757 So just to get started as well, Teleport works with everything you have. We have a number of different database integrations in Teleport. As you can see, we support most popular ones. MySQL, Postgres, Mongo, Oracle. We support Microsoft SQL Server. We support Elasticsearch. We support most AWS databases. Aurora, RDS, the MySQL hosted versions of those. Postgres. All those kind of things. RedShift, DynamoDB. A whole lot of other things as well. So we're always adding new database integrations. There's a great big guides page which details how to get going with every one of these and how they work in different regards. So I'll send a link to that documentation at the end of the presentation. This presentation is going to focus specifically on Postgres because that's something I had set up and I was familiar with, but you can do these same access flows of RBAC and everything for all the databases that we support.

Four key pillars of secure access

Gus: 00:04:54.098 So Teleport manages four key pillars of secure access. Just explaining kind of what it does if you're not super familiar. So Teleport handles four different things. It handles authentication. So the concept of who is trying to access resources, whether that's a human user or whether that's a bot service account, some kind of service like that. It supports the concept of authorization. So once you know who's trying to access resources, what are they allowed to do? Are they allowed to connect to a given database? And if they are, what access are they allowed to use? What user can they connect as? What database inside that — the cluster — can they connect to?

Gus: 00:05:29.214 Teleport also handles the concept of connectivity. So you don't need direct access to your resources via Teleport at all. Teleport establishes reverse tunnels between you and Teleport so that you can get into the cluster, and you can have access to resources. And when you connect the resources, they also form their own reverse tunnels. So Teleport kind of sits in the middle as kind of a connection manager for all of these things and connects everything up. You don't need to open ports. You don't need to forward anything. We'll cover a little bit of more information about that later. And Teleport, as the fourth pillar, supports the concept of auditing. So who or what accessed the resource? What resource was it? When did they do it? What did they do? Who granted them permission, or what was the reason they gave? Security teams often want to know these things, compliance, auditing. They all want to be able to provide this information. And Teleport is a great way to get all of that information.

Gus: 00:06:21.489 So how does Teleport work? I'm not going to go into a ton of detail or get super technical about this. If you do have any questions, you can ask them at the end, and I'll try and cover them. But essentially, Teleport is this large purple T in the middle. Teleport enforces access policies, it maintains audit logs, and it acts as the gateway between you and your infrastructure, essentially. So there's two main components that sit here. There's the Auth Service that handles authentication, authorization, auditing, and that kind of thing. Also handles the role-based access control, enforcement of multi-factor authentication, certificate authority, certificate rotation, user lifecycle, all these kind of things. And then you have the Proxy Service, which is the public-facing component that users and resources connect to. If you're running a client, that's connecting to the proxy service, and then that's being connected onto the resources. That uses TLS authentication. Mutual TLS authentication.

Gus: 00:07:10.725 So Teleport issues short-lived certificates to users when they log in and those certificates are then used for authentication. And that also maintains the reverse tunnels between the proxy and the target. So on the bottom, you've got the engineers or the machines, as I mentioned, the robots, connecting to your infrastructure. And then, behind that, you have the infrastructure. So although this presentation and this webinar is focusing on databases, Teleport does handle other services as well. So it also handles SSH. Regular SSH server access. It handles access to Kubernetes clusters. It handles access to web applications, so HTTP, HTTPS-based applications, as well as TCP port forwarding if you want to do that. And it also manages access to Windows desktop machines and Windows servers if you like to do that. All of these go in largely via the same interface and have the same auditing, the same access controls, and so forth applying to them.

Certificate-based authentication for databases

Gus: 00:07:59.726 So let's talk a little bit about certificate-based authentication for databases. So most traditional solutions out there, the way that they work, is that they will store the username and password for a database or a connection string or something similar. They maintain that in a vault. Whether that's actual vaults or whether that's something like 1Password or LastPass or anything. They essentially end up submitting this username and password on behalf of the end user. Now that works very well. It's definitely one way of approaching it. But the issue is that those passwords still need to be rotated, managed, maintained. If one leaks for any reason, you have to go and change it everywhere, and then you have to update everything that wants to use it. It's not as graceful a solution as it could be. Teleport's different to this. Teleport doesn't use usernames and passwords to connect to databases or to any resource inside Teleport for that matter. Teleport uses short-lived certificates throughout.

Gus: 00:08:49.255 So as I mentioned on the previous slide, Teleport issues short-lived certificates to users when they log in. So when you log in, it issues you a short-lived certificate. And it also uses those same certificates to authenticate with the database. So when we look at this diagram here, if we look at the bottom, we have this flow where Joe — we'll, say, use Joe here as the user on the left-hand side — wants to log in. Joe enters his username and password for Teleport. He logs in. Or he uses his SSO provider, single sign-on, if he has that configured. SAML or OIDC. Teleport supports all of these too. And when Joe has successfully authenticated with Teleport itself, Teleport gives him back this short-lived certificate, which encapsulates all of the roles that Joe has and gives him all of his access. And then he can keep that on his laptop.

Gus: 00:09:33.821 Whenever he wants to connect to a resource, he presents that certificate to Teleport. And Teleport decides — okay, Joe is allowed to access this resource. It connects Joe up to the database service, which then allows him to access different databases. So, in this example, we have two databases. One of them is in the development environment, so it has the `env: dev` label set on it, and one is in the staging environment. So it has this `env: staging`. And whenever Joe presents those certificates, the database service will check to make sure that the certificate has the right principals and is allowed to connect, presents it to the database, and the database will validate it too. So the database is trusting that Teleport is signing the certificates that are connecting to it. And with that, you get this end-to-end trust going on between the two.

RBAC for databases

Gus: 00:10:16.369 So you can also do role-based access control for databases in Teleport. So there's three main things that you can do for databases. You can detail the labels that a user should be allowed to connect to. So in my previous diagram here, we have the concept of this being the developer environment database and this being the staging environment. So in the roles, you can say — this role applies to anything with the labels `env: dev` for example. So you can make roles that just apply to development. And then you can say — okay, inside the development environment, any database that's part of that, we should allow this database name for access. So you might have a customer's database, an invoice database — any other dataset that you have. You can specify that in Teleport’s role-based access control as well. And if people try to connect to database names that they don't have access to, Teleport won't allow them to do that. Then you can also specify the usernames or the roles inside the databases that people can use. So you can have shared users, read-only, read-write users, that kind of thing. Teleport will allow people to use them, but then in its audit log, it will also maintain a record of exactly who those people were and what they were doing as those shared principals and shared users inside the database.

Gus: 00:11:26.081 This lets you build up more complex rules inside Teleport about who can do what. So you can make statements like: "Users always have read/write access to developer environment databases, or users always have read-only access to staging, but they can request a role with read/write access if they need it." Users can't see production at all without raising an access request and having it approved, for example, and any request for read/write access in production requires two approvals. So you can build up these flexible policies based on whatever you like. These are just examples that I've come up with. You can set them to whatever you like. But Teleport can enforce these, and it can make sure that people can't get access to things that you don't want them to have access to without the proper level of approval.
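Those plain-English policies map onto Teleport's YAML role resources. The sketch below is illustrative only — the role name, labels, database names, and users are invented for this example, not taken from the webinar — but it shows where the three controls Gus describes (labels, database names, database users) live in a role spec:

```yaml
# Illustrative sketch of a Teleport role; check the Access Controls
# reference for the authoritative field list.
kind: role
version: v7
metadata:
  name: dev-db-readwrite
spec:
  allow:
    db_labels:
      env: dev                      # only databases labeled env: dev
    db_names: ["customers", "invoices"]
    db_users: ["readwrite"]         # database account(s) users may connect as
```

A user holding only this role would see the `env: dev` databases, be limited to the listed database names, and connect as the shared `readwrite` account — with the audit log still recording their real Teleport identity.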

Access requests for role elevation

Gus: 00:12:07.926 This also lets you specify rules about who can request and/or approve access. So here are some examples of Teleport roles but written in English. So anyone who has access to dev databases is also allowed to request read-only or read/write access to production databases. That's just a plain English representation of something you could write in a Teleport role. Or anyone who has this production read-only approval role can approve requests for read-only production access. Sounds logical enough. Anyone who has the production read/write approval role can approve requests for read/write access to production, but any requests for this require two approvals rather than one. So there's a whole number of different ways that you can build these things up.
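Those request/approval rules are also expressed in role YAML. A hedged sketch (role names and the two-approval threshold are examples, mirroring the spoken policy above rather than any real configuration):

```yaml
# Illustrative: developers may *request* production database roles...
kind: role
version: v7
metadata:
  name: developer
spec:
  allow:
    request:
      roles: ["prod-db-readonly", "prod-db-readwrite"]
      thresholds:
        - approve: 2                # e.g. require two approvals
---
# ...and a separate role grants the right to *review* those requests.
kind: role
version: v7
metadata:
  name: prod-db-approver
spec:
  allow:
    review_requests:
      roles: ["prod-db-readonly", "prod-db-readwrite"]
```

Splitting requesting and reviewing into separate roles is what lets you express "dev users can ask, only approvers can grant" as policy.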

Compatible with developer-friendly tooling

Gus: 00:12:49.902 So Teleport is compatible with developer-friendly tooling as well. So when you want to connect to resources via Teleport, you've got the concept of — Teleport will open an authenticated tunnel from your laptop to the Teleport proxy, and then the Teleport proxy will forward your access through to the remote database. So this means that you don't have to have direct connectivity to the database. It can be on the other side of the world in a private subnet that you don't directly have access to. And you don't need to connect to a VPN. You don't need to open any ports. You don't need to do any of that. Teleport opens this tunnel for you and handles the authentication and the connectivity for you. So if you use our graphical client that I mentioned, Teleport Connect that I'm going to be using during the demo, this is handled automatically. Where you start up. You connect that tunnel. And any client that you connect to that local host and port on your machine will automatically be connected to the database.

Gus: 00:13:42.201 You can also connect to these with the `tsh` command line tool. `tsh` is for humans. You can use that to log in, get certificates, start tunnels, raise access requests, all that kind of thing. You can use the command line if you don't want to use Teleport Connect or the Teleport web interface. And there's also `tbot`, which is the Machine ID service. `tbot` is the service that lets robots or automated services connect to Teleport databases as well. And you can open as many of these different tunnels as you need to. Say you've got five different databases in five different regions. You can have five local ports open, and each one will send you to a different database. You can have them all running concurrently and put traffic over all of them at once. It's absolutely fine. Not a problem at all. So if you've got these workflows where you've got databases in different regions you need to connect to, just open five different tunnels and they're all in the same place. You don't have to switch VPNs. You don't have to change anything at all. It all just works.

Gus: 00:14:39.501 So whenever you're doing this as well, it's important to note that you don't need to provide a username or password or even your certificates or anything at all to these tunnels because they're authenticated tunnels. As long as you're the only person with access to your laptop, you connect to that, and Teleport will automatically handle the authentication for you. That means that your connection string to your database looks something like this. It's literally just the username and the host and the port and the database name. There are no passwords, no complicated certificate paths, no SSL settings, anything like that. It just works automatically. And Teleport handles all of the authentication for you and forwards it all over TLS. This makes these sorts of tunnels a great fit for tooling like SQLAlchemy, for example, if you're running a Jupyter Notebook and you want to make some SQL connections from inside Python. It also means you can use things like TensorFlow I/O to read datasets from SQL databases as well. The other thing is that because these authenticated tunnels are going via the Teleport proxy, all the connections that you make and all the SQL statements you run, they all appear in Teleport's audit log. So it tracks the real username. Joe in our example. It knows that Joe is the person running the query. It knows the time that he ran the query. It knows what database he connected to, what database user he used, and any access request identity that he was using if he made an access request to access that database too. So a phenomenal amount of information is available for auditing purposes if that's what you need.
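The connection-string shape described here can be sketched in Python. The user, port, and database name below follow the later demo but should be treated as placeholders for your own tunnel:

```python
# Minimal sketch: a SQLAlchemy-style connection string for a database
# reached through a Teleport-authenticated local tunnel. There is no
# password and no SSL configuration -- the tunnel itself handles
# authentication. The names and port here are illustrative.
user, host, port, dbname = "postgres", "localhost", 55432, "dvdrental"
conn_string = f"postgresql://{user}@{host}:{port}/{dbname}"
print(conn_string)  # postgresql://postgres@localhost:55432/dvdrental

# With SQLAlchemy installed, this string is all an engine needs:
# from sqlalchemy import create_engine
# engine = create_engine(conn_string)
```

The absence of credentials in the string is the point: anything that can reach the local port inherits the tunnel's authentication, so the same string works unchanged across notebooks and scripts.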

Demo time

Gus: 00:16:06.481 Okay. It's now time for the demo. So I'm going to reshare my screen, and I'm going to show you something more involved. So let me find the right button and we will share the entire screen here. Okay. So I'm going to start up a connection flow with Teleport Connect. So I'll make this window a little bit bigger. On the right-hand side, I've got my administrator session for my cluster. But what I'm going to do is I'm going to log in as a regular, non-privileged user, just a standard developer user, and I'm going to request access to a resource. So I'm going to connect to my cluster. I'm going to start that, and I'm going to log in using Okta here as my credentials. So when I click on this, Okta's going to pop up and ask me to log in. I'm going to use my [email protected] account here, which is my developer-level account. I'm going to fill in the password. I'm going to provide my — going to attach my YubiKey here on my laptop for second factor. And I'm going to be told, "Login successful." So when I switch back here to Teleport Connect, I'm now logged in and I can see the resources. I can see that I'm logged in as [email protected]. And I can see the connections I have.

Gus: 00:17:15.533 If I go to databases, I can see I don't actually have any access to any databases at the moment because I haven't raised an access request yet. So what I'm going to do is I'm going to raise an access request here via the menu. Go to the menu and say, "New access request." I'm going to bring this up and say, "Right, I want to request this data science databases role here." I'm going to add it to my request, and I'm going to proceed through. I can put in a reason why I need this access. So quite simply, I need access to the databases. I'm going to submit that request. It's going to say, "Well done. You've submitted that." And now it's ready for an administrator to approve. So there are open source plugins you can put with this as well. So when an access request is raised, you can get Slack notifications, Mattermost notifications, Teams notifications, Discord notifications, email notifications. You can raise a Jira ticket. You can submit a flow via PagerDuty. There's a whole bunch of different plugins. They're all open source. You can fork them and write your own. There's a fairly simple API for doing it. So you can integrate with whatever you have to let people know that there are access requests available. You can also, via the API, build flows to automatically approve access requests in certain conditions and things like that. So the sky's kind of the limit with what you want to do with this.

Gus: 00:18:24.629 In my example, what I'm going to do is — in my right-hand window here — I actually have an administrator login. `webvictim` here is my administrator. I'm going to go to access requests as the administrator and have a look. And what I can see is that there is an access request that's been raised here. Somebody needs access. I can click on view. This is going to say, "[email protected] submitted the request. I need access to the databases." They're requesting the data science databases role. And I can say, "Yeah. That's fine. You can have this — you can have this access," and submit the review. This request's now been approved. The Slack channel would have been updated if there was a plugin running for that.

Gus: 00:19:00.432 And now as this user, if I go back to the listing, I can see that actually my request has been approved. So I can now assume that role because it's been approved, and I can get this access. I've now got that access for the next eight hours. That's something you can set on a role-by-role basis as well, how long should people have access for. Eight hours is the default here, and that's fine for me to have. So now when I go back here, and I click on databases, we can see this new database has appeared. So now I've got this Postgres database that I didn't have access to before, which is going to allow me to connect onto things. So I can click on connect, and it's going to give me a list of the users that I'm allowed to use on that database. I've only got access to the Postgres user here. Simple enough. So I click on that. And that's going to establish that tunnel — local tunnel that I mentioned — to connect me through. I'm going to change the port here because I want this to run on a specific port. And I'm going to say, "I want to connect to this DVD rental database," which is something I have. If I click on run here, this is going to start a local Postgres client. But actually, I can just connect to this port now, and that's going to get me access through to the database.

Gus: 00:20:05.178 So what I'll do, go back over to my browser here, and I'm going to start up a session in a Jupyter Notebook. So in Jupyter Notebook here, I can say, "Okay. I need access to the database." So I'm going to put in my connection string `postgres@localhost:55432`. The port that I had on the DVD rental database. I'm going to replace Postgres here with PostgreSQL because that's what SQLAlchemy needs, and I'm going to run this very simple SQL query. `SELECT * FROM actor LIMIT 25;` So when I run this, it's going to connect me through to the database. And I can now see that I've got 25 results come back, just as I requested in my query. If I, as the administrator, now go and look at the audit logs for the cluster, what we can see is [email protected] connected to the database and then ran this query. So we can see that, although as the user I ran this, my administrator user has access to see exactly what I ran. And when we look at the database audit log, we can see the query here, `SELECT * FROM actor LIMIT 25`. So any query that the user runs appears in the audit log and the administrators can see what it is. It also gives you a lot of other information, like who was the user, [email protected]. It tells you the user identity. It tells you the time they ran the query.

Gus: 00:21:19.841 It also tells you the access request ID. So this was the UUID of the access request that I raised and that was then approved. So now I can go back and link this through with the audit log and say, "If I go and search for this access request in the audit log, we can actually see the access request was created and is pending, and then `webvictim` reviewed it." We can look at the review, and we can say, "Oh, this access request that was used for this database query, it was approved by `webvictim` at this time, and here was the message they put along with it." And you can look at the original access request to say, "What was the reason this was raised?" "Oh, I need access to the databases." So you can see all of this information. Everything is logged through as a logical chain. And you can export these through to event management or anything similar to have access and to be able to see that. So this is a very simple query. I can also do another thing where I say, `SELECT * FROM actor WHERE first_name = 'Kevin'`. I can use parameters here and variables. It's going to give me all the results where the name is Kevin. And of course, if I go back here, and I refresh the audit log, we can see the `SELECT * FROM actor WHERE first_name = 'Kevin'` query go into the audit log.
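The parameterized pattern mentioned here can be sketched as follows. Since the webinar's Postgres instance isn't available, an in-memory SQLite table (with invented rows) stands in to show the same bind-parameter idea, which carries over directly to SQLAlchemy against the tunneled Postgres database:

```python
import sqlite3

# Stand-in for the demo's actor table: an in-memory SQLite database
# with a few illustrative rows, so the example runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE actor (actor_id INTEGER, first_name TEXT)")
conn.executemany("INSERT INTO actor VALUES (?, ?)",
                 [(1, "Kevin"), (2, "Penelope"), (3, "Kevin")])

# Bind the name as a parameter rather than interpolating it into the SQL.
rows = conn.execute(
    "SELECT actor_id, first_name FROM actor"
    " WHERE first_name = ? ORDER BY actor_id",
    ("Kevin",),
).fetchall()
print(rows)  # [(1, 'Kevin'), (3, 'Kevin')]
```

Because the query still travels through the Teleport tunnel unchanged, the fully rendered statement is what lands in the audit log, exactly as shown in the demo.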

Gus: 00:22:25.981 So let's try something a little bit more advanced, maybe. This is a fairly simple SQL use case. Let's try something just a touch more advanced. So I mentioned the concept of using TensorFlow I/O. So what I'm going to do is pull in a little bit of air quality data. I've got another database in my same cluster here, so I'm going to try and pull through a little bit of air quality data. If I run this query, this is essentially going to run TensorFlow. It's going to connect to the database, and it's going to pull out a couple of tensors. So I've got some carbon monoxide data here, and I've got some air quality data. This has given me a couple of tensors that I can then use to build other queries. And we can see as well that same query has come through and is appearing in the audit log.
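A rough sketch of that TensorFlow I/O step is below. The endpoint format follows the tensorflow-io documentation for PostgreSQL sources, but the port, database name, and column names are assumptions, so verify them against your own setup:

```python
# Sketch (assumed names): build the endpoint string that tensorflow-io's
# IODataset.from_sql expects for a PostgreSQL source, pointed at the
# local Teleport tunnel rather than the real database host.
def build_pg_endpoint(user: str, host: str, port: int, dbname: str) -> str:
    return f"postgresql://{user}@{host}?port={port}&dbname={dbname}"

endpoint = build_pg_endpoint("postgres", "localhost", 55432, "airquality")
print(endpoint)

# With tensorflow-io installed, the tensors could then be pulled with
# something like the following (API shape per the tensorflow-io docs;
# column names are hypothetical -- check locally before relying on it):
# import tensorflow_io as tfio
# dataset = tfio.experimental.IODataset.from_sql(
#     query="SELECT co, pt08s1 FROM air_quality;",
#     endpoint=endpoint,
# )
```

Nothing about TensorFlow I/O is Teleport-specific here: it simply sees a plain, credential-free Postgres endpoint on localhost, which is why the integration works without code changes.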

Gus: 00:23:05.142 So say I want to do a simple kind of query here, where I just select these, do a query to calculate nitrous oxide, take the first records. This is going to run TensorFlow. It's going to pull that information and do me a simple query. This isn't particularly advanced, but what it does show is that you can just pull this data. Run queries on it via a familiar developer-friendly tool. So I've just got JupyterLab running here on my local laptop, and I've got a port forward of that to the database. So I don't need access to anything else. I can do all of these things locally, but Teleport is providing me the connectivity to the database on the other side of the world and giving me auditing and information to go along with that as well. I tried to build a more advanced flow here with Keras trying to build these things. The point is, if I run this, it's not actually going to work because I'm not a data scientist. I'm not an expert on this by any means, and I can't figure out how to make this work. I tried for a very long time.

Gus: 00:23:56.970 However, what I can show you is that this query here, the `SELECT * WHERE` — it's pulling out the data values that I would be using to build my layers. What that is going to do is it's going to show me the query appearing in the audit log. And I can see that I'm running this query and all of the data is there and all of the information is available. So Teleport's doing its job. It's just that I don't know what I'm doing, so I can't build the machine learning flow here. But with some more time and some more knowledge, I'm quite sure that I could. So anyway. That's the whole point of what Teleport's doing. And that's kind of the information to go along with this. So that was it for the live demo. I'm just going to reshare my screen quickly to give back the automatic — just to give back the presentation mode.

Next steps

Gus: 00:24:39.108 So next steps here. Things that you can do. We have a platform called Teleport Team, which is our hosted SaaS platform. It gives you access to all of the Teleport functionality on a 14-day trial basis. It also contains Teleport Assist, which is our AI-based GPT tooling. It can help you write command line commands for system administration and automatically run them on your servers if desired. So you can say, "Show me how much disk space is free on all of my servers," and it'll give you a command to do that and then go and run it for you and give you back the result. You can sign up for a trial on our website. You can also download the open-source version and run it for free if you want. Run it on your own infrastructure. But the hosted solution is much easier. And you can read a bit more about Teleport Database Access: what it is that we do, what databases we support, and how you set them all up. There's a couple of links there. And I know that Kat is going to share those as well.


Gus: 00:25:34.527 So that's largely it for my part of the presentation. I will go through, and I'm going to take a look at some questions and answers and try and do those as well. But just wanted to say before I start doing that, thank you very much for coming. I appreciate that. I'll leave the next steps links on screen so that you can read those as well. So we have a couple of questions here. So a question from Sadiq saying, can you use ABAC with Teleport too? So attribute-based access control. Theoretically, yes, you can. It's more of a question of how you build it. But if you're talking about attributes set on users or things like that, you can essentially pull information through. So if you sign into Teleport with a single sign-on identity, for example, you can pull attributes through from users as set in your SSO provider. You can have a list of databases that a user should have access to, and Teleport can pull that information through and use it to automatically build roles on your behalf. We have some role-templating functionality that allows you to do this sort of thing. So those kinds of things you might be able to do.

Gus: 00:26:43.774 From the database side, it's more a case of building Teleport roles which target databases based on tags. So, for example, if you're using RDS-based databases in AWS, you can pull all the RDS tags through into Teleport and have roles which target those. You can say, "This role should apply to any instance which has this tag set on it," for example. I appreciate that's perhaps not quite what you're asking; it's difficult to get the exact details from the question alone. If you'd like a more specific answer, you can go to Teleport on GitHub and raise a discussion post there with exactly what your query is, and I can definitely reply and make sure these things get answered in more specifics.
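As a sketch of that tag-based targeting (the role name, tag, and user names are illustrative), a role can match databases by the AWS tags Teleport imports as labels:

```yaml
kind: role
version: v5
metadata:
  name: rds-dev-readonly      # illustrative role name
spec:
  allow:
    # Applies to any database whose imported RDS tags include env=dev
    db_labels:
      env: "dev"
    db_users: ["readonly"]    # assumes a "readonly" user exists in the database
    db_names: ["*"]
```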

Gus: 00:27:30.733 And the other question was: how does this work for managing database connections from other servers? For example, Fargate containers connecting to an Aurora cluster. There are a couple of different ways you could do this. I mentioned the concept of Machine ID and robot-based access. One thing we see people doing is deploying Machine ID as a sidecar. So if you've got a Fargate deployment, they would deploy Machine ID into that Fargate cluster. What Machine ID does, essentially, is connect to a Teleport cluster and get very short-lived certificates that are valid for, say, 20 minutes or an hour at a time. It writes those certificates to disk and refreshes them constantly on a loop, every 20 minutes or so. If you have some process that picks those certificates up and puts them into Kubernetes Secrets storage, AWS Secrets Manager, Vault, or whatever else you want to use, then other workloads in Fargate can pick up that secret and use it to connect through. So those short-lived certificates for a robot identity can be picked up by other services and used for their connectivity as well. That's the sort of flow you can build.
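A minimal sketch of that sidecar setup, assuming a `tbot` (Machine ID) configuration of roughly this shape; the cluster address, join token, paths, and database names are all placeholders, and field names may vary between Teleport versions:

```yaml
# tbot.yaml: renews short-lived database certificates on a loop
version: v2
proxy_server: teleport.example.com:443   # placeholder cluster address
onboarding:
  join_method: token
  token: fargate-bot-token               # placeholder join token
storage:
  type: directory
  path: /var/lib/teleport/bot
outputs:
  - type: database
    destination:
      type: directory
      path: /opt/machine-id              # certs land here; sync them to your secrets store
    service: aurora-prod                 # placeholder database resource name
    username: app_reader
    database: appdb
```

A separate process (or the workload itself) would then watch `/opt/machine-id` and push the refreshed certificates into Secrets Manager, Vault, or Kubernetes Secrets.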

Gus: 00:28:44.572 You can run Machine ID, like I say, as a sidecar, or as a completely separate workload. `tbot` has proxy functionality as well, so you can have Machine ID generate the short-lived certificates and also open a local listener inside that sidecar. Your main workload then connects through the sidecar, with the traffic going through Teleport and into the audit log and so forth. So it's definitely possible to do it that way if you want, and we have people who are building those flows. The Machine ID flow for doing this in containers is still a little bit nascent and still being worked on, but people have built these flows and they are working. So it's definitely something that's possible to do.

Gus: 00:29:26.605 We have a question from David here: what level of granularity can you set Database Access on? For instance, can you set table- and/or column-level access? The answer at the Teleport level is no. Teleport doesn't enforce column-level access itself; within its roles, it doesn't have the concept of column-level access. What you can do, though, and what we propose for this, is to configure a known set of principals in the database itself. So you have some shared users, essentially. If you want an anonymized viewer role where, for example, a user can pull out customer identifiers but not names, you can build those permission models at the database level itself, and then Teleport can enforce access to those database roles for you.
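For that anonymized-viewer idea, the database-level permission model could look like this in Postgres (the role, database, table, and column names are hypothetical); Postgres supports column-level `SELECT` grants:

```sql
-- A shared, limited principal that Teleport maps users onto
CREATE ROLE anon_viewer LOGIN;

GRANT CONNECT ON DATABASE app TO anon_viewer;
GRANT USAGE ON SCHEMA public TO anon_viewer;

-- Column-level grant: identifiers are visible, names are not
GRANT SELECT (customer_id, created_at) ON public.customers TO anon_viewer;
```

Teleport then decides which of its users may connect as `anon_viewer`, while Postgres enforces what that principal can actually see.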

Gus: 00:30:21.729 In the most recent versions of Teleport, we've also added functionality where Teleport can automatically create database users for you. So if a database user doesn't already exist, Teleport can create it for you, and you can decide what role it should inherit from: whether it should be an administrator, or inherit some other role. You can build those kinds of flows out as well. So while you can't do column-level access in Teleport, you can enforce at the database level that a given role or user is only allowed to access a given table. You can do that in Postgres just by creating a user and limiting its permissions. And then via Teleport, you can say, "Okay, you're only allowed to use this limited user unless you request access to the more privileged one that can see all of the columns," for example.

### Closing words

Gus: 00:31:08.741 I think that was everything we had in the questions at the moment. Like I mentioned, if there's anything else, you can find Teleport on GitHub: you can go there, look at the code, and we've got very active issues and discussion forums. There's also the Teleport Community Slack, which you can sign up for. Thanks, Kat, for putting the link there as well. Lots of Teleport staff are there and can help and answer questions, all those kinds of things. There's a link at the bottom of the main Teleport website that says, "Go to Slack." Sign up there, and we'll send you an invite.

Gus: 00:31:48.364 So yeah, thank you so much for coming along and watching this. We're just about on time as well, which is a miracle. I really appreciate it. We will send out the recording and the slide deck. And like I say, if you've got any other questions, come find us. And give it a try: Teleport is free. You can deploy all of this for free and use it with local users, or sign up for the free trial of Teleport Team online and connect a database to it. We've got workflows telling you how to do it from start to finish. It wouldn't take you that long to build something like this, and it's very impressive when you do. Thanks very much for coming along, and I'll see you around.

