0:00 So just to give you an idea about the architecture here, and then we'll get directly into the demo. 0:07 So what Anusha was talking about, the unified fabric that does all this SASE engineering together for us, is right here in the center, where you have cloud ZTNA, SWG, and SSE services, all instantiated in the cloud, consumable in the cloud, able to address users coming in from remote work locations, from offices, whether it is VDI infrastructure or machine devices, right? 0:36 So all of that can be addressed through this unified SASE fabric. 0:41 This unified SASE fabric addresses both profiles of traffic, private and public. 0:46 So if the traffic is toward SaaS applications or the public Internet, the policies can be enforced right at these cloud gateways. 0:55 If the traffic is toward your private applications sitting in your data centers or in your private instances in cloud infrastructure like Azure or AWS, then again you can enforce specific policies, very granular ZTNA controls, for the right profile of traffic. 1:12 So for this demonstration, we will be focusing on cloud DLP, network DLP, and I am coming in from a remote machine that you see right here. 1:23 So I'm working from home. 1:25 This is the user, and we'll see how we can make sure that no sensitive information is exfiltrated. 1:33 Let me take you to the management plane, which is Concerto. 1:37 In the previous sessions of this series, you would have seen that we always talk about this unified management plane, with which you can do your Internet protection, private app protection, remote client connectivity management, complete SASE administration, and manage your cloud security and cloud SSE functions. 1:55 So that's the reason we call it a unified SASE platform. 1:59 The complete end-to-end lifecycle can be managed, not just the configuration aspect but also the monitoring and management aspects of the solution.
2:06 All of that can be managed from one single pane. 2:09 Most of you are our existing customers, 2:11 so you have your on-premise SD-WAN versions, SD-WAN flavors; this cloud infrastructure can seamlessly integrate with your on-premise SD-WAN fabric to give you end-to-end visibility, like what kind of traffic is flowing between your branches and the cloud gateways, whether those are dedicated or shared. 2:31 So this gives you unparalleled visibility, which helps with operations, capacity planning, and a lot of other things. 2:38 So, Abhi, just to unfold that: 2:40 a lot of you folks have Director, so you would keep using Director for SD-WAN configuration. 2:45 You'd use this Concerto tool to configure SSE. 2:49 But then in Concerto you can see both the SSE and the SD-WAN in that single pane of glass, 2:56 from a monitoring perspective. 2:58 That's correct, Clint. 3:00 So typically, as you mentioned, most of our existing customers have their own on-premise secure SD-WAN infrastructure that they are managing, or co-managing with partners or with Versa. 3:16 And what we are proposing is a shared or a dedicated cloud SASE fabric. 3:21 So what we are dealing with today is what's going to be unified SASE: while the on-premise infrastructure is managed by a customer headend, in the cloud infrastructure there would be a dedicated or shared headend. 3:34 But both of these can be managed using a single Concerto, so that when they log into the management plane, which I was showing here, they get end-to-end visibility. 3:46 So whenever Versa is presenting anything on the unified SASE front, you will see Concerto. 3:51 You know, that's the unified management platform. 3:54 And because we are focusing on DLP, 3:56 I just want to expand upon this DLP right here.
4:00 So under real-time protection, when you click on profiles, this is where you'll see all the different SSE functions and SWG functions. 4:08 So let's take a look at DLP right here. 4:14 Now, Anusha in her presentation was talking about the capabilities, and what I want to highlight here is that it's a very granular approach that we have taken, because this is not consumed independently the way most on-premise DLP solutions are, where those are separate systems, 4:30 and if you have to integrate that with a ZTNA approach, it's like managing two different systems. 4:36 When you consume cloud DLP, this function works very closely with other functions. 4:42 Anusha highlighted CASB. 4:43 It also works very closely with the ZTNA model. 4:46 So where I'm going with this is that you can have multiple DLP profiles, like you see here. 4:51 And these could be specific to business groups, departments, your partners, or the profiling of traffic. 5:01 Let's say you have sensitive traffic identified based on applications, or based on business functions, or maybe the way the traffic is exchanged between partners and associates. 5:16 So there could be various parameters by which you want to identify the traffic, right? 5:21 And then apply that specific DLP profile. 5:24 For example, if a person is in HR, or let's say it's a financial vertical and they don't have access to the core banking application, there is no point in enforcing some custom data protection profiles at all. 5:35 So that's the reason we provide customers the flexibility to create various DLP profiles that suit their environment and make them specific.
5:44 Don't inspect the traffic unnecessarily when it is not required, which again gives you better results in terms of performance, because some of these functions are extremely compute intensive. 5:54 So the profile is what gets associated with the actual traffic flow in the access policies. 6:00 But what is there in this DLP profile? For example, we see this expertband Global DLP profile. 6:07 That's the tenant that I have, and I'm just calling this Global DLP. 6:10 I just have one Global DLP profile here, which is a collection of various rules that we have created. 6:17 Now, a rule is where we define the specifics, what kind of rule it is. 6:22 As Anusha mentioned, there are five broader categories based on which you could create DLP rules: 6:28 content analysis, file DLP, EDM, fingerprinting, and OCR. 6:33 So all of that goes into the rules, 6:35 and the collection of rules is what we call a DLP profile. 6:40 Then we pick this DLP profile and associate it with the actual access policies, which have granular ZTNA policies associated. 6:49 So let's look at what is in these DLP rules. 6:52 For example, if I try to create a new rule, these are those five constructs, the bases around which you could create your DLP rules: content analysis, file DLP, OCR, exact data match, and document fingerprinting. 7:07 Now, to take a look at content analysis: as part of the package, as part of our offering, we provide a lot of predefined data protection profiles. 7:18 These are based on various regulatory standards, et cetera, 7:22 so that you don't have to write your own custom data protection profiles. 7:26 You could just use those, for example US PII data, or PCI DSS; you just call that particular predefined data protection profile and you can do content analysis right away.
7:39 If you want to write your own custom data protection profiles based on regex or keywords, that's when you would use the custom data protection patterns that we support, which would show up right here under user defined. 7:52 And as you can see here, I've got various user-defined data protection profiles available, 7:58 so we can just use those. 7:59 Then we go to the next section, where we can specify the file types. 8:05 Again, the objective here is to avoid false positives and negatives; you are given this opportunity to make those rules very specific. 8:12 If the business says, for example, that it has to be a doc, docx, or PDF file for that particular profile of traffic, then there is no point in analyzing various other types of files. 8:24 So that's where this would definitely help, and it results in a good overall user experience. 8:32 So this is where you define the various file types, and then you go to the next section, where the content and the context are defined. 8:41 This is where you control what kind of activity it is: upload, download, or both. 8:46 Then you define the protocol, and then the context, whether it is header, body, or attachment. 8:51 So you can define that. 8:54 Next, if you want to exclude some files based on the file names, 8:58 we support that as well. 9:00 And once you are done, you define the action. 9:03 These are the various actions that we support. 9:07 Once you provide the action, say, for example, block, 9:12 you can then log that event, 9:15 and you can also enable a notification profile. 9:17 This is for when you want to send an email notification to the user or the admin that some kind of events have happened.
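To make the content-analysis idea concrete, here is a minimal sketch of a custom regex-plus-keyword pattern scoped to specific file types. The function name, field names, and the SSN pattern are illustrative assumptions for this walkthrough, not Versa's actual configuration schema or API.

```python
import re

# Hypothetical custom data-protection pattern: a regex plus keywords,
# scoped to specific file types to reduce false positives.
US_SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
KEYWORDS = ("ssn", "social security")

def matches_pattern(text: str, allowed_types=("doc", "docx", "pdf"),
                    file_type: str = "pdf") -> bool:
    """Flag content only when the file type is in scope AND both a
    keyword and the regex pattern are present."""
    if file_type not in allowed_types:
        return False  # out-of-scope file types are not inspected at all
    lowered = text.lower()
    has_keyword = any(k in lowered for k in KEYWORDS)
    return has_keyword and bool(US_SSN.search(text))

print(matches_pattern("Employee SSN: 123-45-6789"))            # in-scope match
print(matches_pattern("SSN: 123-45-6789", file_type="png"))    # skipped
```

The file-type gate mirrors the point made in the demo: restricting which files are inspected both sharpens accuracy and avoids spending compute on traffic that cannot match.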
9:24 So we can notify the users or administrator with the events, or a summary of the events that have happened in the past one day or five days. 9:32 All of that is customizable, but this can definitely be sent. 9:35 And as you can see here, if you want to set some kind of label, the MIP label that Microsoft offers for O365 subscriptions, we can put those tags in there. 9:47 So let's say something comes in and it has a specific pattern; when you are sharing it, you may want to put a MIP label onto it so that it gets tagged as top secret, confidential, or whatever you have. 10:01 Those capabilities are also available here. 10:04 So this way we basically end up creating various rules. 10:07 I just showed content analysis. 10:10 We have the same thing for OCR, for file DLP, for exact data match, and for fingerprinting. 10:17 Once we have these rules written, we can go to the DLP profiles and create that collection of rules. 10:25 Let me show you here. 10:26 So this is my expertband Global DLP profile, which is a collection of these rules. 10:31 You can pick and choose the rules that make sense for that profile. 10:35 And then what you have to do is set the order here. 10:42 Why is order important? 10:45 Again, this is a feature that we support: from a logging perspective, you want to make sure that any sensitive document that's scanned, which has multiple pieces of sensitive information, 10:56 let's say credit card information, PII information, and some other stuff, 11:01 has everything logged. 11:04 In that case, you may want to scan against all of these rules. 11:07 But in certain cases the objective is just to scan, 11:10 and if there is any sensitive information getting exfiltrated, you want to block that activity; then you would just say exit on first match.
11:17 That's the feature that you see here. 11:19 So you can say exit on first match, 11:21 and then we would not go through all of these rules. 11:24 The moment there's a hit, we will match it, take the action, and exit from there. 11:30 So that option is also available. 11:33 Then we also have forensic features, 11:36 and this is very, very important. 11:38 When we talk about network DLP or any DLP solution: why was that particular flow blocked, what was sensitive, and what kind of sensitive data was getting exfiltrated? 11:48 With forensics we get all that data. 11:50 We provide a synopsis of that in Analytics, which is the logging and analytics engine, again integrated into this unified SASE platform. 11:59 But we also provide the capability to encrypt that sensitive payload or data sample and export or upload it to this cloud infrastructure that we have. 12:09 And then we also provide the forensic data as to why that particular file was flagged as a sensitive file, why it was blocked, and what was in it. 12:21 So all of that can be done here. 12:29 And as you see here, just to show it to everybody: when it comes to creating custom data patterns, we support that. 12:36 Within your organization 12:38 you may have something that you qualify as sensitive information. 12:41 It could be invoicing, it could be account details, it could be some internal employee ID, etc. 12:46 So you could create sensitive data patterns based on regular expressions and keywords, and then call them in these custom data protection profiles. 12:57 So other than the whole database of predefined data protection profiles that we have, you could create your own data protection profiles and then call them in the DLP rules.
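The rule ordering and exit-on-first-match behavior described above can be sketched in a few lines. The rule structure and predicate style here are illustrative assumptions; the real profile schema is whatever Concerto stores.

```python
# Sketch of how an ordered DLP profile might evaluate its rules.
# exit_on_first_match=False evaluates every rule so all violations get
# logged; True stops at the first hit, takes the action, and exits.

def evaluate_profile(rules, text, exit_on_first_match=False):
    hits = []
    for rule in rules:            # rules are tried in the configured order
        if rule["predicate"](text):
            hits.append(rule["name"])
            if exit_on_first_match:
                break             # take the action and exit right here
    return hits

rules = [
    {"name": "pci-cards", "predicate": lambda t: "4111" in t},
    {"name": "pii-ssn",   "predicate": lambda t: "123-45-6789" in t},
]
doc = "card 4111111111111111, ssn 123-45-6789"
print(evaluate_profile(rules, doc))                            # both logged
print(evaluate_profile(rules, doc, exit_on_first_match=True))  # first only
```

This is why order matters: with early exit, only the first matching rule in the list determines what shows up in the logs.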
13:07 The other important aspect of these custom data protection profiles is the combinations that Anusha was highlighting: to really meet your business requirements and technical requirements, 13:19 sometimes it's required to have a rich feature set to get it right, like these boolean operations, AND, OR, NEAR, NOT. 13:28 We support all those operations so that you can create the exact data protection profile that you need, then call that in the DLP rules and enable your DLP profiles. 13:38 So this is the flow. 13:40 Once we are ready with our DLP profiles, the next thing is to associate the DLP profile with our Internet protection or private app protection access policies. 13:50 For this demonstration, I've got this demo access policy. 13:54 And in this policy, as you can see, I'm not filtering based on the application. 13:59 I'm saying for all the applications, all the users, any kind of posture; I'm not checking the posture or the risk score of that particular user. 14:08 Wherever the user is coming in from, wherever they are going, no constraint is specified on the service side; the zones are specified: whether the user is coming in from my SD-WAN zone, which is a branch zone, or the user is coming in from the remote client. 14:23 It can also support clientless remote users, using a PAC file based connection going toward the Internet. 14:32 So you can define it very precisely. 14:36 And then, once we have our traffic identified, we enforce our security policies. 14:42 So here I have DLP, and you can see that I'm calling the expertband Global DLP profile. 14:51 All those rules are mapped to that particular traffic flow fitting into this access policy. 15:00 So this is how we enforce cloud-based DLP protection.
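The boolean combinators mentioned earlier (AND, OR, NOT, NEAR) are standard pattern-composition operators; NEAR is the least obvious one, so here is a rough sketch of what it might mean. The word-window definition and its size are assumptions for illustration, not the engine's actual semantics.

```python
# Sketch of a NEAR combinator: both terms must occur within `window`
# words of each other. AND/OR/NOT then compose naturally on top of
# term checks like this one.

def near(text, a, b, window=10):
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if a in w]
    pos_b = [i for i, w in enumerate(words) if b in w]
    return any(abs(i - j) <= window for i in pos_a for j in pos_b)

text = "account number 8843 belongs to invoice 7721 for acme corp"
print(near(text, "account", "invoice"))          # within the default window
print(near(text, "account", "acme", window=2))   # too far apart
```

Proximity operators like this cut false positives: "account" and "invoice" appearing in the same sentence is a much stronger signal than the two words merely sharing a long document.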
15:05 Now, once the configurations are done comes the operational aspect: how do we see live users and all of that? 15:13 So let me go to the monitor page. 15:19 I'm connected to the San Jose gateway. 15:26 OK, let me go to Analytics. 15:35 OK, so let me go to expertband. 15:39 This is the tenant where we are doing this POC demo. 15:43 And if you see here, the first thing that I want to talk about is right here. 15:49 You can see this is March 12th, 2024. 15:52 That's me logged in here, abhishekam@expertband.com. 15:58 I logged in at 12:21, and this is my SASE client. 16:03 So you can see that this is the user, and I'm connected to the San Jose gateway. 16:07 What I'm trying to show here is that that's me; 16:11 I'm on the infrastructure. 16:14 The second thing I want to do is run some live traffic here. 16:18 So I have my OneDrive open. 16:23 I'm using the OneDrive application to demonstrate these capabilities. 16:26 Let me go to a particular directory. 16:29 And the first thing that I want to try here is to exfiltrate some sensitive information, 16:34 let's say some PCI or PII data. 16:38 So I'm going to use a payload from one of these typical Internet-based test sites. 16:45 I'm going to use this file here. 16:56 This has PII data, and it has PCI DSS data as well, a standard test document. 17:03 So let me try to upload it. 17:09 What's happening here is that the cloud gateway intercepts the flow, does complete decryption, looks at it, and identifies the sensitive information. 17:22 Then the cloud gateway sends a notification back to the agent that's running here and says, hey, I think this is a DLP violation.
17:31 So the moment that happens, the session gets reset, mail is sent to the network administrator, or, if it is configured such that the user should also get some kind of notification, then the user gets the notification too; plus this flow is logged in Analytics. 17:50 So if we go into Analytics and go to DLP, I can go to that specific gateway, which is San Jose. 18:00 When you are in a production environment, you could have multiple gateways, 18:05 so you have the capability to look at a specific gateway over a specific duration. 18:11 If you are looking for activity that happened last week, or the Wednesday before last between 3:00 Central and 5:00 Central, you could specify the timestamps here and then look for the flow logs. 18:22 This one just happened at 12:41; the application was SharePoint, the user, 18:28 what type of rule it was, 18:30 then you have the data pattern, the profile, the file name, the file type, the action, the direction of the flow, and the rule name. 18:43 Within that particular DLP profile there could be multiple rules. 18:50 So as you see here, because I did not have exit on first match, within a particular file there could be multiple matches. 18:57 You can see here that there is PII data and there is PCI data in the same file. 19:04 Now, what is not shown here is a rule violation field, which for some reason I had to disable because of some other stuff. 19:14 But what we also provide is forensic data. 19:18 So there would be another field here which would have the exact information. 19:22 It would be obfuscated, obviously, but it would maintain the format. 19:28 So a credit card number would be in that format; 19:30 if it's an SSN, it would be in that format.
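The format-preserving obfuscation just described can be illustrated with a one-liner: digits are masked, but separators and lengths survive, so an admin can recognize the shape of a credit card number or an SSN without ever seeing the value. This is only a sketch of the idea, not Versa's masking routine.

```python
# Mask digits while preserving the value's format (lengths, separators),
# so forensic logs reveal what kind of data matched without leaking it.

def obfuscate(value: str) -> str:
    return "".join("X" if ch.isdigit() else ch for ch in value)

print(obfuscate("4111-1111-1111-1111"))  # XXXX-XXXX-XXXX-XXXX
print(obfuscate("123-45-6789"))          # XXX-XX-XXXX
```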
19:33 But it will tell you that, because of that, it was identified as sensitive information and was blocked. 19:39 Also, as I said, if you choose encrypt and upload, then the payload is encrypted and uploaded to a cloud infrastructure library, from where you can access it and take a look at the forensic file to see why Versa flagged this as a potential data leak, and take appropriate action. 20:01 The next test that I want to run here is OCR: 20:06 let's see how Versa can identify an image, which is not a typical text file. 20:11 We process it on the cloud gateways. 20:14 We have various 20:16 different technologies to look at different file types, and then we process them to see what the content actually is, and we analyze it against the rule that you create. 20:25 So if in the OCR rule you're saying scan pictures for PCI DSS or PII data, 20:34 it will accordingly scan for that and take the corrective action. 20:39 So let's try to upload a picture here. 20:45 I've got this picture. 20:47 It has an SSN and a date of birth, which is typical PII data. 20:53 So let's try to upload this. 20:59 And you can see here that the Versa DLP engine again does the same thing. 21:03 It blocks it, because that's how the configuration is. 21:07 And when we go to Analytics to look at the data, it's an optical character recognition match; the application information, 21:20 all these pieces are in here, the file name and the file type. 21:28 So that's on the OCR side. 21:31 And then, if you want to look at some of the file-based DLP that Anusha was talking about in her presentation, MIP is something that is quite important for a lot of enterprises using Office 365.
21:46 So what we can do is take actions based on the MIP label. 21:52 At the same time, we can look at a particular file, match on it, and enforce a particular label on it. 22:02 Like, for example, Versa Confidential. 22:05 If a document is tagged as Versa Confidential, we can basically read the tag and then change the tag or retag it. 22:19 So if I quickly show you this document, this is some patent document. 22:26 As you can see, it has the Versa Confidential MIP label right here. 22:30 If you look at the top, it has the Versa Confidential label. 22:34 So the rule that I've written says that if you find anything that has a MIP label of Versa Confidential, block it. 22:41 You just want to block it. 22:43 So let's see if Versa is able to block it if I try to upload this document. 22:48 And if I go back to my demo setup and I try to, yeah, that's the file. 23:08 So let me try to upload this. 23:14 OK, so here we see that the file is blocked by DLP, but let's see why it is blocked by DLP. 23:20 I'm going to go back to Analytics, refresh the page here, and I will have to wait a little bit. 23:29 I just tried it a little while back; at 11:43 you can see the same thing. 23:34 So, file-level matched, 23:36 Versa Confidential, 23:38 that's the match string. 23:39 So based on this label, the action taken is block, right here. 23:44 And likewise, if you want to tag it with a new label, based on some parameters, we can do that also. 23:54 I mean, there's a question in chat: what if we're doing this, let's say, for a use case where we're not using the Versa software client on the end station; it's devices in, let's say, a retail store, and the edge device, Versa or someone else, is sending untrusted traffic to the Versa cloud gateway.
24:12 Would the DLP still work in that case? 24:16 Yeah, DLP does work. 24:17 It's on the service side, on the gateway side; 24:21 the enforcement is going to be there. 24:25 We're not looking at specific endpoint profiles or device posture here. 24:31 So this is more like: what is the action? 24:34 Oh, somebody's uploading a document to Box. 24:37 So let us look at what that means. 24:39 Does it have sensitive data? And that's what is going to be flagged. 24:43 So this is entirely done in the Versa cloud gateway and in our advanced security cloud. 24:50 It's not done through the endpoint. 24:53 OK, so in that case it would be blocked, but since there's no Versa software client, the user wouldn't know why it was blocked; 24:59 it just wouldn't work. 25:00 Correct. 25:02 The pop-up that you saw will be seen by the end user: 25:06 the violation, right, 25:07 the policy violation of why it was blocked. 25:09 The end user should know, because the admin has already configured the policies. 25:14 Even if they don't have the client, they should be able to see the pop-up. 25:18 That's rendered by the Versa cloud gateway. 25:21 It's not, oh, I thought that was coming from the software client. OK, so that's rendered through, like, a browser. Then I'll just add there: 25:28 when you are coming in from a client, the notification would be as you saw here, right? 25:35 It was like a tiny window that popped up on the right side, and the look and feel was as if it's coming out of the SASE client. 25:44 When users are coming in from a non-client environment, they don't have any client. 25:50 In that case, what we do is basically leverage, as Anusha mentioned, JavaScript and so on, so that the gateway can talk to the user over the browser.
26:03 And then we invoke the JavaScript and we show it; it's more of a dialog box. 26:09 It's a little different from this, but it's a dialog box that comes up in the center, and it says that sensitive information exfiltration was identified and the activity is blocked by the administrator. 26:20 Got it. 26:21 OK. 26:21 And then, time check: 26:22 we're at 12:51 and the meeting ends at one; there are, like, two slides on private SASE, 26:28 and we want to leave a little time for discussion. 26:30 So yeah, I am almost done here. 26:33 So just let me run an EDM test case here. 26:37 With EDM, what we can do is pick up your repository; you can upload those files as a trusted source onto the cloud instance. 26:47 And then it's a one-way hash that we do, 26:50 so once uploaded, I cannot read the content of what is there in the trusted source. 26:55 And then you can enforce various boolean operations based on the patterns. 26:59 So let's say the file uploaded has employee IDs, SSNs, and various other parameters, right? 27:05 What we can do as part of this EDM configuration, let me show you quickly here; not here, sorry. 27:15 Let's go back to the DLP configuration. 27:18 And here I have an EDM rule. 27:21 So here, as you see, you define these expressions, which are part of that trusted source document that we have uploaded. 27:30 And then here you can apply various operations. 27:34 So it's like: if I say MRID or SSN, that qualifies for a match, and then the action would be enforced. 27:43 So in this demo, what I would be doing is I have a web form that I will try to upload here, and we'll see. 27:51 Let me just quickly show you that thing. 27:58 So this is that web form.
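The one-way-hash point about EDM can be sketched as follows: the trusted source is stored only as hashes, so the matching engine never holds the plaintext records. Normalization, salting, and tokenization details here are assumptions for illustration only.

```python
import hashlib

# Sketch of exact data match (EDM): every cell of the trusted-source
# table is stored as a one-way hash; outbound text is tokenized and each
# token's hash is looked up, so plaintext records are never readable.

def _h(value: str) -> str:
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def build_index(records):
    """Hash every cell of the trusted-source table into a set."""
    return {_h(cell) for row in records for cell in row}

def scan(text, index):
    """Return the tokens in outbound text whose hash is in the index."""
    return [tok for tok in text.split() if _h(tok) in index]

index = build_index([("323456789", "E1001"), ("987654321", "E1002")])
print(scan("form submitted with SSN 323456789 by user E1001", index))
```

Because only hashes are stored, even an operator with access to the cloud instance cannot recover the original SSNs or employee IDs from the index.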
28:01 Here you can see somebody's trying to exfiltrate the SSN, which is filled in as the actual value; you can see this is the number 323456789. 28:13 And what we would do is, when this document is getting uploaded, redact that. 28:20 So let me try that, and let's see how it looks now. 28:38 So if you look at this, the engine identifies that and it basically redacts it. 28:45 We also support tokenization. 28:47 As Anusha mentioned, in certain scenarios we need to maintain the format; 28:52 we can do that. 28:53 We would just randomize the numbers, all those characters, 28:56 so that is also supported. 28:58 And that's EDM. 29:01 The last thing that I have here is fingerprinting. 29:05 For fingerprinting, what I have is this document that I have uploaded. 29:11 As Anusha was saying, you have to upload the reference document, and then you create the rule; so here, let me open the rule, document fingerprinting. 29:24 In here, what we do is define the similarity threshold. 29:29 It uses all of this AI/ML stuff to match and see what level of similarity exists between the reference document and the one that we are testing against, 29:44 and then accordingly the action is taken. 29:47 To upload your reference documents, in the settings space you have files and folders. 29:57 So this is where, for various functions like DLP EDM, I've uploaded the trusted source, and for fingerprinting, 30:03 this is where I've uploaded the file. 30:04 So this is the file that I've uploaded, which is the reference document. 30:08 And now I'll try to test it with a filled form; that's fingerprinting. 30:15 So this is my filled document. 30:17 Here I've got some data, and the threshold that I've defined is 50%.
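A very rough way to picture the fingerprinting threshold is word-shingle overlap with Jaccard similarity; the actual engine uses more sophisticated ML, so this only illustrates the "block when similarity crosses the configured threshold" idea, with the 50% value from the demo.

```python
# Sketch of document fingerprinting: reduce both the reference document
# and the candidate to word shingles, compare with Jaccard similarity,
# and block when the score meets the configured threshold.

def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

reference = "employee onboarding form name address ssn date of birth signature"
filled    = "employee onboarding form name address ssn date of birth 323456789"

score = similarity(reference, filled)
print(score >= 0.5)  # meets the 50% threshold, so the flow is blocked
```

A filled-out form still shares most of its structure with the blank template, which is exactly why fingerprinting catches it even though the content differs.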
30:22 So what's going to happen is it will analyze and scan it, look at the document similarity index, and then take an action: OK, this is more than 50% similar, so I'm going to block it. 30:36 That is the reason why the Versa cloud gateways blocked this. 30:40 And all this information, all of these logs, will be available in Analytics for analysis; let me quickly show you, for expertband, San Jose gateway, last day. 30:57 And here you have exact data match, 30:59 and then you'll see document fingerprinting flows coming in. 31:05 Yeah, right here. 31:09 So whatever you have seen in the last 30-35 minutes, it was an actual user. 31:13 I was connected to a cloud gateway. 31:15 I was doing my work, and intentionally or unintentionally, whenever I tried exfiltrating some information, the cloud gateway identified it and blocked it. 31:24 And from a network or SASE administrator perspective, we would have notifications, we would have forensic data, all of this available to identify loopholes and take action. 31:37 That's it from my side, then over to you. 31:39 All right, thank you. 31:41 Any questions for Abhi? 31:43 I think there's one in the chat from Raju. 31:45 Maybe we can take that. 31:47 So the question is that DLP has been associated with CASB for preventing data exfiltration. 31:52 That is true for Internet-bound traffic. 31:56 But the fact is that DLP is also very valuable when coupled with Versa's zero trust network access, which is the VPN replacement for private apps, for private access. 32:09 So it means that we can detect data exfiltration even to a private app. 32:15 It is the same intellectual property leakage, it is the same sensitive data leakage, but the same thing can be detected even for private apps, and that is very valuable.