
Enabling Compliance for Database Access - overview

Enterprise databases hold an organization's most sensitive information and need to be protected. Beyond that, organizations must also demonstrate compliance with frameworks like FedRAMP, HIPAA, SOC 2, and GDPR for these databases. Complying with these frameworks without slowing down DBA teams is a challenge. This webinar demonstrates how to unify access controls for connectivity, authentication, authorization, and audit for the popular OSS databases Postgres, MySQL, and MongoDB so you can move fast but stay secure.

Key topics on Enabling Compliance for Database Access

  • The Teleport Access Platform is identity-aware and also acts as a certificate authority.
  • A typical access pattern is bastion-style access, where databases run on private DNS or behind very restricted network access.
  • Anti-patterns pose risk; they include using the same username and password across users, databases, and applications.
  • Customers transitioning to Teleport often consider aspects like user population types, compliance requirements, the number of people who will authenticate, and other factors.
  • Teleport can provide secure access to PostgreSQL, MySQL, and MongoDB databases while improving both access control and visibility.

Expanding your knowledge on Enabling Compliance for Database Access

Introduction - Enabling Compliance for Database Access

(The transcript of the session)

Stephen: 00:00:02.181 Welcome to today's webcast, brought to you by Teleport. I'm Stephen Faig, director at Database Trends and Applications and Unisphere Research. I will be your host for today's broadcast. Our presentation today is titled Enabling Compliance for Database Access. Before we begin, I want to explain how you can be a part of this broadcast. There will be a question and answer session. If you have a question during the presentation, just type it into the question box provided and click on the submit button. We'll try to get as many questions as possible, but if your question has not been answered during the show, you will receive an email response. Plus, all viewers today will be entered for a chance to win a $100 Amazon gift card just for participating. Now to introduce our speaker for today, we have Steven Martin, Solutions Engineer Manager at Teleport. Now I'm going to pass the event over to Steven.

Steven: 00:00:56.800 Thanks, Stephen. A great name, by the way. I appreciate being here. Today I'll be stepping you through how you can unify your access controls for open-source software databases, databases like Postgres, MongoDB, and MySQL. Let's step through what our agenda will be. First, I'll go over what the Teleport Access Platform is. It's an identity-aware Access Platform that is also a certificate authority. It's also an open-source software offering, so it's something you can use free of charge, or we have an Enterprise offering. We'll be looking specifically at the database access and audit area for OSS databases. While going through that, we'll look at where people usually start out in managing their database access and some issues with that, as well as particular anti-patterns in database access, why those practices can lead to issues, and how you convert to an Access Platform approach to make it better. We'll go in depth into how Teleport provides database access, its use of agents, the certificates, how users access databases from their desktops using the same database clients they're already using, and then give you some ideas of how that could fit into your environment.

Steven: 00:02:32.522 We'll also talk about, "Well, how can you transition to that?" Because it's one thing to say that this can fit into your environment or that it has a good feel, but what is a realistic way to convert your environment? And we'll be stepping through that in detail. We'll also go through a demonstration in addition to the discussion here. I'll show the various ways you can access [inaudible] multiple databases in Teleport, as well as how Teleport can allow your users to follow a zero-trust approach for privileged access in addition to their default access roles. And lastly, one of the best parts is going through questions and answers. So as you go through, we'd like to have your questions submitted, and we want to address those after the presentation.

Overview of Teleport Access Platform

Steven: 00:03:29.043 Okay. Well, first, let's look at the Teleport Access Platform. As I mentioned, it's an identity-aware Access Platform as well as a certificate authority. Part of the key to Teleport is that it is certificate-based. We issue short-lived certificates to users, and they use those for their web or desktop access when working with resources. Certificates are also used for the resources we're connecting to. Teleport is a Go-based solution. In the diagram there is a Teleport Proxy, Teleport Auth, and then the resources at the bottom: SSH, Windows, MySQL. All of those are resources we're enabling access to, and an agent, that same Teleport binary, is being used to facilitate access. The agent doesn't need to run on the same machine as the particular resource; it simply needs to be able to connect to that resource and enable access to it. And the Teleport server, again, is an open-source offering. We also offer a hosted service version. It can run in your own environment on completely private DNS; some folks run it on Kubernetes clusters, in AWS, or in Azure. It's really easy to run. Our CEO often likes to run it on a Raspberry Pi, so it can run from very small to very large installations with highly available configurations.

Steven: 00:05:01.310 Now, typically, a user is coming in, you see that developer, could be an SRE type. They would authenticate through the web console. They go through the Proxy, and the Proxy works with the Auth, which allows you to authenticate through an SSO, so something like Auth0, Okta, or GitHub. The metadata coming back, the claims and traits from your SSO, will map to your particular roles. Teleport is a role-based access control system, so a user has one or more roles for whatever they're accessing. That also holds true for service users: you can issue a service user account that again uses roles and allows you to interact with Teleport and the resources. Now, once a user has been issued a certificate, that certificate will have a certain time limit, which we recommend being short-lived, something like 10 hours or 2 days. Once that expires, the user reauthenticates and gets the latest updates.
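
As a rough illustration of how an SSO connector maps identity-provider groups to Teleport roles, here is a minimal sketch of a GitHub auth connector. The organization, team, and role names are hypothetical, and the exact fields depend on your Teleport version.

    kind: github
    version: v3
    metadata:
      name: github
    spec:
      display: GitHub
      client_id: <oauth-app-client-id>
      client_secret: <oauth-app-client-secret>
      redirect_url: https://teleport.example.com:443/v1/webapi/github/callback
      # Map GitHub org/team membership to Teleport roles.
      teams_to_roles:
        - organization: example-org
          team: dbas
          roles: ["developer"]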

Steven: 00:06:03.650 Also, as you can see at the top, we have access request integration. Teleport can allow users to request additional roles, and I'll show this as part of my demonstration. Those roles give them access to, let's say, prod, or just a different set of access on the same resource that they have by default, perhaps the root user or some other privileged user. They can then access that resource in a different way. That's still short-lived. So that access request would only apply for that time period, let's say an hour, or whatever period of time is allowed by those roles, and then revert back. And at any point, you can lock people's access, either permanently or temporarily, and even lock nodes. So if there's any concern about a breach, or for maintenance, it's very simple to stop access in real time.
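
Locks are applied from the command line with tctl, Teleport's admin tool. A small example, with a hypothetical user and message:

    # Temporarily block a user across the cluster; the lock expires after the TTL.
    tctl lock --user=bob --message="investigating suspicious activity" --ttl=10h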

Typical access patterns

Steven: 00:07:09.493 Now, in terms of what we often see, and this holds true for many types of resources, not just databases: often folks are using a bastion style of access, where the databases are running in private DNS or with very restricted access, even if they have a public IP. Typically, users cannot directly access those databases. Often there is a bastion server, an EC2 node, a VM, or bare metal, that users are allowed to access from their desktops or web consoles, and then they authenticate again through the bastion, or through the bastion to the database. Now, that's not an uncommon pattern. It allows you to expose a database through an authenticated node. But there are some concerns with that approach, namely that this can often be a single point of failure. You're setting up a server, and because there isn't really a set method for bastions, everyone often rolls their own. That bastion's maintenance and operation are often ad hoc: it's periodically updated and patched, and it's often done manually. So what if that server has to go down or go for maintenance? It can be a long period of outage where people cannot get to the resources that they need.

Steven: 00:08:45.487 As well, the credentials for the user to connect to that bastion server and then through to the database are often non-expiring. So whether they're password-based or they're using SSH keys to get to the bastion, that can lead to issues of, "Okay. Well, where are those keys? How are they locked down and revoked?" And that's always a concern. There's also the issue of auditing. Who is checking who's connecting through that bastion? Who's connecting to that database? In terms of onboarding or off-boarding people, that can also be difficult, especially when there is no common single sign-on with that particular server. You may be issuing static keys or usernames and passwords. If that's a manual operation, it will tend to slow you down. And the harder part is not onboarding but off-boarding. How do I get rid of the user's credentials? If they all had shared passwords, how do I make sure that they aren't continuing to use them? There's also the concern of those credentials being stored on their particular laptops, which may be personal laptops. How are you making sure that those people aren't able to access at a later date? And that's always a major concern.

Steven: 00:10:15.313 The other issue is that in the case you saw, we were going from one bastion server to a set of databases in one network. That's pretty easy. What happens when you start adding other networks? Do you keep opening up connections from that particular bastion into other networks? You might have to stand up new bastion hosts. So it becomes harder and harder to keep adding all of these different networks, especially if they're in different VPCs or different clouds. And that can become quite challenging, especially with a server that you're building from bare bones and having to maintain yourself.

Anti-patterns in DB access

Steven: 00:11:00.601 Now, I wanted to go over some anti-patterns, that is, ways of working that can hurt you later. Often that's things like using the same username and password for users, databases, and applications. So you might have a particular application, whether it's a Python or a Node.js application, connecting to a database, a Postgres or MySQL, and that same username and password is being used by other individuals. We'd say, one, we recommend using certificate-based authentication, whether it's for an application or for a user, and also that those should be separated. The same username should not be used by an application and also by a DBA or an external user. When it is, it can be hard to track who's using it. The more widely it's used, the harder it is to change, because changing it becomes disruptive to all those sets of users. And in some cases, it becomes even harder to know who's accessing what from where using that user.

Steven: 00:12:13.927 Now, the next is part of the zero-trust approach of, "If I give you access, am I giving you too much access? If I give you access to a database, do you really need to have the ability to [inaudible]? Do you need the ability to radically change things in it?" For example, if I'm accessing a particular database and someone gets access to my user without my knowledge, they might be able to insert a function into my database. They might be able to make it so that every time a particular insert is done, a function is called and that data is sent somewhere else. So there are things that could be done without you realizing it; having more rights than you really need exposes your information. It's often best to make sure that your grants are leaner and really targeted at what the person does day-to-day, as opposed to giving them full admin rights over the application database.
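
As a hypothetical Postgres illustration of leaner grants, a reporting user can be limited to read-only access on one schema instead of reusing the application owner's account (the names here are made up for the example):

    -- Create a narrowly scoped, read-only account instead of sharing the app owner.
    CREATE ROLE report_reader LOGIN;
    GRANT CONNECT ON DATABASE app TO report_reader;
    GRANT USAGE ON SCHEMA public TO report_reader;
    GRANT SELECT ON ALL TABLES IN SCHEMA public TO report_reader;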

Steven: 00:13:17.399 Next, we've mentioned a couple of times the idea of non-expiring passwords or static credentials. If you're not rotating your passwords and you're not using certificate-based, time-limited credentials, it becomes very difficult to maintain. That often requires using a vault-like approach, or asking, "Okay, no one else is using the password, right?" and manually checking with individuals who's using it and who's not, worrying about devices being lost, trying to use device attestation, which can be difficult to enforce. So non-expiring credentials can be difficult to manage, and one of the things we recommend is using multi-factor authentication along with passwords. When people authenticate, it's not just a password but a second factor, things like YubiKeys or other WebAuthn kinds of authentication. Another thing, even if you use certificates: there's always the concern of having a single administrator, of having only one individual who can actually connect and maintain your set of resources. If that person can't access the system, can no one else be enabled? Essentially, is there no way to restore access unless this one person is available all the time? In today's world, it's very difficult to guarantee that people will still be with the same organization, that nothing catastrophic happens to their hardware, or that they won't be unavailable for a certain amount of time. That's a real danger to the database and the resources, and it's even potentially disruptive to your work.

Steven: 00:15:12.598 So we'd say there should always be a way of getting access by some means besides just one single person. The other question, whether or not you have multiple administrators, is: if you're maintaining backups, are those backups deletable by a DBA? If someone does get access to your database and is able to corrupt certain data, do you have backups that are immutable, that are not able to be changed by those same people? It's the same zero-trust idea: not just giving full, regular access to backups, but being able to say, "If I needed to restore, I know this would not have been altered by someone whose access was compromised." And then: do you have a privilege-escalation process? If someone has developer access and I need to give them access to prod, what is my approach? Is it written down? Is there an automated process that can give temporary access to someone and then revoke it once their time has expired? Without that, you can have cases where you're giving them too much access, you're giving them the wrong access, or they're getting into the wrong system. And particularly if you have a tiered approach of Dev, Test, Staging, Prod, or different types of departments and access, it's really good to have an approach where you can temporarily grant that access and make sure to roll it back in a timely manner.

Teleport Database Access

Steven: 00:17:06.290 Now, in terms of Teleport Database Access and how it addresses those kinds of concerns: one of the first things, again, is that Teleport is certificate-based. As you can see on the left, when you're interacting, whether that's from psql or MySQL-type database clients, they're using certificates. An individual would have authenticated into Teleport and been issued their certificate; they're then connecting through Teleport to the Teleport database service. The service will only allow certificates that have been signed by the Teleport Auth. As you can see here, Joe had a password and a second factor with which they authenticated themselves and were issued that certificate. The database service will confirm that user certificate and then also check the particular role configuration: which tagged databases can they see, which usernames are allowed? Provided that all fits, they would then be able to connect to the various databases, whether that's self-hosted Postgres, cloud-based services, or other databases.
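
The role configuration the database service checks looks roughly like the sketch below: which database labels a user can see, which database accounts and schemas they may use, and how long their certificate lives. The names and values here are hypothetical.

    kind: role
    version: v5
    metadata:
      name: developer
    spec:
      options:
        # Certificates expire after this TTL and the user must reauthenticate.
        max_session_ttl: 10h
      allow:
        # Only databases carrying this label are visible to the role.
        db_labels:
          tier: dev
        # Database accounts and schemas the user may connect as / to.
        db_users: ["reader"]
        db_names: ["app"]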

Steven: 00:18:19.009 And going back to the anti-patterns and issues, this gets away from a piece of software that's handled manually and run just by your organization. This is open-source software, which we consider the best way to do this kind of security. It's audited annually by Doyensec to make sure that there aren't exposures in the service. It uses full encryption. It's also able to be highly available, so that if, let's say, the Teleport database service goes down, you can have multiple agents running that all connect to the same databases. You can have it configured so that nothing's written to disk. So, a high level of security. It's not storing any passwords to these databases; it's using certificate or IAM credentials to connect. That way, you can have that high level of security without leaving behind static keys or dealing with vaulted passwords, rotation, things of that nature. We consider this a very secure way to do it, and again, we think our open-source approach is the best way to confirm that we're using a highly secure pattern.

Transitioning to Teleport Access Platform

Steven: 00:19:40.083 Now, how would you transition to something like this? We go through this with a number of our customers and also when working with our open-source users. Often the best way is to start with your set of use cases. What user population types do you have? Is it a large number of data analysts, or SRE types doing monitoring? Do you have escalation flows? It's really about collecting those use cases and seeing what your needs are. You might have a certain type of compliance you're trying to reach, whether it's SOC 2 or another framework. Organizations like ourselves can also work with you to expand those out. Another part of it is your single sign-on. How are people going to authenticate? Do you want to use local users? That's fine. Do you also want to use GitHub, which is part of our open-source offering? In our Enterprise offering, we also support OIDC in the same way. People often also want to validate break-glass scenarios. As I mentioned, what if an agent goes down? How do we make sure that we still have access? So when you're using these tools, make sure there are ways to continue the business and access your infrastructure if there's any kind of disruption (and we've all seen disruption in certain cloud environments in the last number of weeks).

Steven: 00:21:12.133 Now, in terms of Teleport, you can always try out the open-source offering that's available directly from our goteleport.com website. It's easily installable directly onto a regular Linux VM as well as hostable on Kubernetes, Docker, any number of environments. In terms of our commercial offering, there's our cloud, where you can run our hosted service. We have two-week trials there, and there's always our Enterprise discussion with folks who want a supported version with certain Enterprise features. Now, for the actual rollout, we typically recommend a phased approach where you'd confirm that set of criteria and use cases. Next is how you're hosting it. Are you hosting it yourself? Are you hosting it in our environment? That's where we confirm with you, "Okay, what are you accessing? What type of roles do you want?" Most people end up wanting to use our Terraform provider, so you can deploy Teleport with Terraform in AWS, and you can also manage the configuration of Teleport, roles, auth connectors to GitHub and other single sign-ons, through that Terraform provider. People often find that a very convenient way of maintaining the configuration, seeing exactly which changes they apply, and knowing whether anything has changed.
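
A minimal sketch of managing Teleport configuration through its Terraform provider is below. The provider source, address, and authentication method are assumptions to check against the provider documentation for your Teleport version.

    terraform {
      required_providers {
        teleport = {
          # Provider source as published by Teleport; pin a version matching your cluster.
          source = "terraform.releases.teleport.dev/gravitational/teleport"
        }
      }
    }

    provider "teleport" {
      # Hypothetical proxy address; the provider authenticates with an identity file or join token.
      addr = "teleport.example.com:443"
    }

    # Roles, GitHub/OIDC connectors, and similar resources are then declared as
    # teleport_role, teleport_github_connector, etc., and reviewed like any other
    # Terraform change.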

Steven: 00:22:49.483 The other important part, and you could even consider this part of your use cases, is who's going to maintain Teleport operationally. It's not typically a full-time endeavor, but you would want to know who's going to be responsible for maintaining things like roles and connectors, reviewing the audit log, and taking in the periodic updates that happen with a product like this. We regularly release major and minor versions; with the hosted service, those will be applied automatically for you, and self-hosted, you can take them on yourself. It's a simple process of replacing the binary, so it's not a very difficult endeavor. You'd just want to know who's going to operate that. We also recommend using our access requests for making changes in Teleport, so that people confirm a change before they make it, because whenever you deal with access management, you want to make sure that people are not inadvertently giving themselves too much access, and you are able to control that.

Steven: 00:24:09.894 Now, stepping a little deeper into how Teleport works for database access and the different combinations you can have: you can see on the left I have a user, they've been issued a certificate within Teleport, and they connect through the Proxy. They can connect over a single port, which is one of the modes we support. We also have multiport support, where you connect through HTTPS as well as things like a dedicated MySQL port. One of the things about Teleport is that it's very flexible in how you manage your connections. You can have a public internet address, you can have a private DNS address, and you can also have multiple addresses. So if you want your agents connecting back to a different address, that can be different from the user-facing one. Now, when a user is connecting using a database client, they connect through the Proxy, which then connects to something like the Teleport agent you can see on the lower right. That agent may be connecting to a CockroachDB database or to different Postgres servers. But the user is not directly connecting to the agent or to the resource itself; that connection is made by the agent. The agent lives in a place where it can connect to those resources.

Steven: 00:25:36.202 I mentioned multiple networks, and one of the key things is the way the agent connects back to the proxy: it uses a reverse tunnel. It does not require opening a port on itself; rather, it dials out and connects to the proxy, establishing the reverse tunnel. That allows the proxy to facilitate communication through the reverse tunnel without having to directly call into the agent. Now, in addition to running on a straight VM, you also have the option of running the agent in a Kubernetes cluster or behind another Teleport cluster. So if I have a single Teleport cluster, let's say in the US East region, which is where I'm located, I might have another Teleport cluster located in the west that has a number of agents connecting to it. I can have that west leaf cluster trust my root cluster. You can see that on the upper right here, where I have a Teleport cluster deployed on a Kubernetes cluster that is connecting to MongoDBs, which then trusts back to my root cluster. There are a couple of reasons to do that. One is that you may not want all your agents connecting across networks; the other is just logical separation. Using role-based access, I also have the ability to say, "You cannot actually connect into this cluster unless your role allows it." So besides resource-based role access, you can even restrict connecting to other clusters. We find that's a good way of completely limiting access. Let's say you had a prod cluster or another customer's cluster, but you want to be able to connect them all together to a central root cluster; we do offer you that control.
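
A rough sketch of an agent configuration that joins over a reverse tunnel and serves a database is shown below; the hostnames, token path, and labels are hypothetical, and exact fields vary by Teleport version.

    version: v3
    teleport:
      # The agent dials out to the proxy and keeps a reverse tunnel open,
      # so no inbound port has to be opened on the database network.
      proxy_server: teleport.example.com:443
      auth_token: /var/lib/teleport/token

    db_service:
      enabled: "yes"
      databases:
        - name: postgres-dev
          protocol: postgres
          uri: postgres.internal.example.com:5432
          static_labels:
            tier: dev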

Demo of DB access and audit in Teleport

Steven: 00:27:32.268 Okay. Now, stepping through a demonstration, I want to show you some of the capabilities of connecting to MySQL, MongoDB, or Postgres, just the experience that a person would have, and also the auditing that happens while a person is doing that, as well as things like tracking the logs, how they're exportable, and how the queries themselves can be made available within a SIEM-type system. So we'll be stepping through that. We'll also show how a developer may initially have a certain type of access and use their existing database clients with it, and then how they could see other environments through their escalation process. All right, so let's go ahead and get into it. So here, [crosstalk]--

Stephen: 00:28:52.898 Looking good, Steven.

Steven: 00:28:55.055 All right. Thank you. I am in a PopSQL client. This is a popular tool that's used to access databases, in this case a Postgres database. And here you can see I have a number of tables available to me; a user can click and see those rows. So in this case, not much different than regular day-to-day access. But let's take a look at what the system recorded as I accessed that. You can see here I have a user, Bob. I'm in here as Steven, and Steven is my admin user. I'm able to monitor, and I can even make some changes to the environment if I wish. While Bob is accessing it, I can see that they've connected here. I can see what database they're connected to, the service name, the database name they're allowed, as well as that user, Bob. So whatever user they're connecting as, I know it was still Bob who connected. That gives me the auditability to know who they are connecting as and where. Now, in terms of how Bob got that particular access: Bob initially logged in from the desktop into this proxy, and he's logged in as this user. He has two roles, access apps databases and access user, and he logs in for SSH as well as databases. When I want to look at the databases available, I can list them. You can see I have some other databases available as well; I'm logged into two of them. I can also see how long I have. Right now, my certificate gives me 10 hours and 39 minutes more of access. Once that time has expired, I'll have to switch back over and reauthenticate, and then I'd be issued an updated certificate.
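
From the command line, that login and discovery flow looks roughly like this (the proxy address, user, and database name are hypothetical):

    # Authenticate through the proxy, via SSO or a local user with MFA.
    tsh login --proxy=teleport.example.com --user=bob

    # Show the current identity, roles, and remaining certificate validity.
    tsh status

    # List the databases this identity is allowed to see, and log in to one.
    tsh db ls
    tsh db login postgres-dev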

Steven: 00:31:07.350 Now, in terms of what queries are being run and additional monitoring: right now, you saw me within our audit logs. Our audit log allows you to see things like when a user logged in, when they ran their queries, any number of activities. But this could also be a large number of events; you can see I already have several hundred here, and often you would want to link that into a particular SIEM or analytics service. In this case, I'm using Elastic. This is one of the features of Teleport: exposing web applications. In this case, I'm doing it through an ELK stack, and through this ELK stack I can see the queries that are running in real time. You can see this is the query that Bob ran. So through a Fluentd integration, I'm feeding in the event log as it occurs, and I can use it to monitor in real time what queries my users are running. This allows me to monitor what they're doing, which is a best practice; it's good to review what users are doing, which in some cases may even have large costs if they're running too many expensive queries. It's also just monitoring how many databases we are accessing. Are we allowing too much access, or are people running into issues of not having the right types of access? That's an example of being able to quickly integrate from Teleport into there. I also could have used our Fluentd integration to easily connect Teleport to things like Splunk or Datadog.
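
The export path described here is typically handled by the teleport-event-handler plugin, which tails the Teleport audit log and forwards events to Fluentd for routing to Elasticsearch, Splunk, Datadog, or another SIEM. A hedged sketch of running it; verify the flags and configuration file format against the plugin documentation for your version:

    # Forward Teleport audit events to the Fluentd endpoint defined in the config file.
    teleport-event-handler start --config teleport-event-handler.toml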

Steven: 00:32:49.779 Now, in terms of Bob's access, you can see an example of a single sign-on here. So I'm logging in as Bob into his web console. Right now he can see these dev databases, but let's say he does need some more privileged access. That's where we can use access requests. As part of this, Bob can go into Access Requests and submit a request here to access prod, saying that he needs to check a deployment. He submits that, and it becomes a pending request. Now, from here, this can be sent to one of those plugins we mentioned, which allow the request either to be auto-approved or to require someone to confirm it. In this case, we're using Slack, and the request is posted to a private channel listing the ID, the cluster, the user, the role they're asking for, and why, giving that reason. Then we have a link here that takes you back, as well as the status of that particular request. Bob can continue to see that request. We go back to our user that has access to review them, and one thing you can separate out is that the person who actually approves it doesn't need access to the resource. They just need their role configured to allow them to review the specific access request, in this case, access prod.

Steven: 00:34:35.938 So Steven would come in — and you can also set certain thresholds. That's one of our features that you can say, "Well, I need two or more people to approve that kind of access." And then we simply submit, and it's approved. Now that will update things like Slack. So a person would see, okay, it has been approved. Someone else in this channel then would not need to go in and approve themselves. And then Bob will be pretty happy because he's gotten his approval. We check and see that it's approved. Now, in order for him to access that, from the desktop, he can take a look at — okay, let's see our requests. I can now take on that particular request ID and get access to that role. Now, once Bob has gotten that request, he then gets an updated certificate with his particular roles, and now he'd be able to see additional databases. So previously, we only saw the four databases. Now, with this new role, we can get into other tiered databases, just refreshing that so you can see a little better. Now, if I want to connect into let's say this MySQL database prod, then I can use the tsh db connect.

[silence]

Steven: 00:36:42.686 So what that did there, it did a couple of things automatically for me. When I did the db connect, it requested the certificate for the user. When I made that request, it verified that my user has access to that particular database, which, remember, has that sort of tag. It confirms my user and that I have access to that schema. Once I'm in here, I can run that query. In this case, I'm just listing the set of employees. In our monitoring, we would then be able to see that particular query in real time. Right there, we can see Bob connecting into that database service, into that name. So that gives you real-time alerting as users are using the system: where are they connecting to, what are they doing? And it makes sure that you're following the zero-trust approach, so a person does not automatically have access to production or other areas that are not appropriate.
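
From the desktop, the elevation-and-connect flow the demo walks through looks roughly like the following tsh commands. The role, request reason, database name, account, and schema are hypothetical, and flag names may differ slightly between Teleport versions:

    # Ask for the elevated role; the request shows up in Slack (or PagerDuty) for review.
    tsh request create --roles=access-prod --reason="checking a deployment"

    # Once approved, assume the request and receive a re-issued short-lived certificate.
    tsh login --request-id=<request-id>

    # The new role exposes additional databases; connect with the usual client,
    # with Teleport supplying the client certificate.
    tsh db ls
    tsh db connect prod-mysql --db-user=reader --db-name=classicmodels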

Steven: 00:38:02.981 Now, in terms of how we actually built that configuration, let's take a quick look at one of the roles. In this case, we've given that access just to this label, tier dev. We've said that they can access other names or users within there. That allows us, within this role, to configure what they can and cannot do. We could be more restrictive and say that they can only see the classicmodels database, and then we might take away other users. So that allows us to further restrict what they're allowed to do, in that we're saying these are the only users they can use. We also have another role, access, where we determine what types of roles they can request. As you can see here, I have a configuration where I can ask for access prod. You can also have a PagerDuty integration, in addition to the Slack one you saw, determining where that request goes in PagerDuty. And as I mentioned before, you can configure it so that a person is auto-approved by a process like PagerDuty: if they're on a particular on-call schedule, they may be approved just because they're on duty during that period of time. In other periods, someone would have to approve them, as you saw me do here directly, as opposed to it being automatic.
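
A sketch of the two pieces described here, the day-to-day role restricted to dev-labeled databases and the access-request wiring, is below. The role names, labels, and database names mirror the demo but should be treated as examples:

    kind: role
    version: v5
    metadata:
      name: access
    spec:
      allow:
        # Day-to-day access: only databases labeled tier=dev,
        # and only the listed schema and database account.
        db_labels:
          tier: dev
        db_names: ["classicmodels"]
        db_users: ["reader"]
        # Elevated roles this user is allowed to request.
        request:
          roles: ["access-prod"]
    ---
    kind: role
    version: v5
    metadata:
      name: reviewer
    spec:
      allow:
        # Reviewers can approve access-prod requests without needing
        # access to the production databases themselves.
        review_requests:
          roles: ["access-prod"]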

Steven: 00:39:39.931 Now, in terms of what we've run through here: we've shown how you can provide access to various databases, whether they're the day-to-day databases a person works with, like a dev one. We've shown how that access can be elevated so you can reach other types of databases. We've also shown how, in real time, you can see the user's interactions: what are they accessing, what queries are they running? And then you can send that into a central logging mechanism, so you can monitor, both over time and in real time, what they're doing and possibly [inaudible]. Now, if you want to learn more, goteleport.com is a great resource. Under there, you can go into our docs and our database guides, which step you through really detailed how-tos: how to set up Teleport the right way, how to connect to your databases, how to set up your users, with very concrete examples. What I did there would be very simple for you to do, setting up a Postgres connection or a MySQL connection, and there you can also see things like MongoDB and CockroachDB. All of those are very clearly laid out. A lot of it is about making this as seamless as possible so that there isn't disruption. Some tools really push for all security or all access, and we want to find the right balance of both. We want to make sure that as the tool comes in, it's not disruptive. As you saw, I'm using a web database client, and I'm not locked down to connecting to a particular bastion server, yet it's all auditing who I'm in there as, and the system still has the ability to control my connections if there's any kind of detected problem. All right, Stephen. Well, I feel ready to move to any Q&A.

Q&A

Stephen: 00:41:56.760 Fantastic. Thank you very much, Steven. Now for some questions from our viewers. First question, is Teleport validated by any companies for its security?

Steven: 00:42:09.860 Yes, at least annually, and sometimes more often, we work with the company Doyensec, which is one of the leaders in security reviews. All of our security reviews are published under our resources, and in there you will find details of what their findings were and how they were addressed. Those are available at least yearly. You're welcome to look at them, and you can always contact us with any questions.

Stephen: 00:42:44.279 Understood. Does the growth in cloud computing across organizations make security easier or more difficult these days?

Steven: 00:42:55.054 I think we've seen a mix of that. We have many clients who are fully off the cloud and air-gapped, and Teleport was initially made, in part, to help with air-gapped environments. In those, you can be more fully in control. I think it just requires that, if you're going non-cloud, you're willing to take on the operation of a non-cloud environment: the servers, the mechanisms, the data centers, power backups, things of that nature. So I think it's still the case that you can do either; you just have to invest in one or the other, or both. A lot of our clients are hybrid, so I don't know that it really makes things more or less secure. It's just that cloud environments can allow you to achieve things at a certain cost. They can often offer you services that would take too much of your own time to figure out how to do. So it's really a trade-off in most cases. I wouldn't say it necessarily changes security that much.

Stephen: 00:44:20.629 Understood. Thanks for clarifying. Next question, does Teleport keep any of the data that is being returned from the database?

Steven: 00:44:29.258 No. Teleport will just keep, as you saw, the database queries, which it stores in the audit log. You do have control over who can access that audit log. In this case, I was allowing the user to see it. You can have it so that they have no access to the audit log and would just be able to make access requests.

Stephen: 00:44:52.777 Got it. Our next question, which part connects to the database, the proxy or the agent?

Steven: 00:44:59.707 The agents do. So the proxy just facilitates access to the services. So wherever Teleport is running, Teleport has a set of services enabled. In the case of the resources, we consider that as the agent services. And then wherever those agents are running, it's doing the local or network connection to the particular resource. The proxy itself, that service plays no part in directly connecting to the database.

Conclusion

Stephen: 00:45:32.160 Understood. We're through the questions that were asked. But I actually have a question for you, Steven. If there's one thing you would really like our viewers today to walk away keeping in mind, what would that be?

Steven: 00:45:45.768 I think they really want to consider how to best access and audit their particular services, making sure that they have a way of having a resilient operation for that and a method that they feel safe with. I really think that's the most important thing: being able to do that.

Stephen: 00:46:18.510 Well, I would like to give a big thank you to our speaker today. Once again, Steven Martin, Solutions Engineer Manager at Teleport. If you would like to review this presentation or send it to a colleague, you can use the same URL that you used for today's live event. It will be archived and you will receive an email once the archive is posted. And just for participating in today's events, you could win this $100 Amazon gift card. The winner will be announced on January 31st. We will reach out to you via email if you are the lucky viewer. Thank you, everyone, for joining us today, and we hope to see you again soon.
