
Email Header IpAddress GeoIP report Addin for Outlook and Outlook on the Web in Office365

Something that can be useful from time to time when looking at email delivery issues or email threats is being able to see the geographical regions an email has traversed during its delivery. Usually this information is stored in the Received headers of the message, but depending on the client and services being used, the source IP address of the client and other intermediaries may also be written to other properties.

Because I needed something last week to do this and couldn't find any other addins that did it, I created a pretty simple Outlook addin that

  • Gets the headers from a Message using the REST API in Office365
  • Uses a RegEx to get all the IP addresses from that header
  • Uses a Set in JavaScript to de-duplicate those IP addresses
  • Then uses one of the many free GeoIP web services out there to query each of the IP addresses returned from the RegEx matches, and finally displays the result in a table back in Outlook
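As a rough sketch of the second and third steps (shown here in PowerShell for illustration rather than the addin's actual JavaScript; the input path is hypothetical), the extraction and de-duplication boils down to a pattern match followed by a unique filter:

# Pull IPv4 addresses out of a raw header string and de-duplicate them
$headerText = Get-Content -Raw "C:\temp\header.txt"
$ips = [regex]::Matches($headerText, '\b(?:\d{1,3}\.){3}\d{1,3}\b') |
    ForEach-Object { $_.Value } |
    Select-Object -Unique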
For example, here is what it returns when run against a normal gray email that was forwarded to my Office365 Mailbox from Gmail

For a quick diagnostic this information is pretty useful, as it tells you where the email traversed, and the Org information will generally tell you the cloud providers being used. Doing this on email in my Junk Email folder that was of a nefarious nature showed that these emails had transited through countries that are the usual suspects in this type of activity.

I've hosted the code on GitHub here https://github.com/gscales/gscales.github.io/tree/master/MailGeoRoutes so the addin can be added straight from the repo if you want to try it (using MyAddins - Custom Addins) and

https://gscales.github.io/MailGeoRoutes/MailGeoRoutes.xml

This uses REST to get the header, so it will only work against Office365, but the same thing could also be done using EWS for OnPrem servers from Exchange 2013 onward.




Using a System-assigned managed identity in an Azure VM with an Azure Key Vault to secure an AppOnly Certificate in a Microsoft Graph or EWS PowerShell Script

One common and long-standing security issue around automation is the physical storage of the credentials your script needs to get whatever task you're trying to automate done. The worst thing you can do from a security point of view is store them as plain text in the script (and there are still plenty of people out there doing this); a better option is to use some encryption (making sure you use the Windows Data Protection API), eg https://practical365.com/blog/saving-credentials-for-office-365-powershell-scripts-and-scheduled-tasks/ . Azure also offers some better options with the ability to secure credentials and certificates in Runbooks, so it is just a few clicks in the GUI and some simple code to secure your credentials when using a Runbook.

In this post I'm going to look at the issues around storing and accessing SSL certificates associated with App-only token authentication that you might be looking to use in automation scripts. This is more for when you can't take advantage of Azure Runbooks and need the flexibility of a VM.

In Exchange (and Exchange Online) EWS Impersonation has been around for quite some time; it offers the ability to have a set of credentials that can impersonate any Mailbox owner. With the Microsoft Graph you have App-only tokens, which offer a similar type of access with the additional ability to greatly limit the scope to certain mailboxes and item types. With App-only tokens you don't have a username/password combination but an SSL certificate or application secret (the latter should be avoided in production). So instead of the concern being the physical security of the username and password combination, your concern now is the security of the underlying certificate.

One of the most important points is that the time to start thinking about the security of the certificate is before you generate it. Eg just having a developer or Ops person generate it on their workstation, leaving copies of the certificate file lying around, is the equivalent of the Post-it note on the monitor. This is where the Azure Key Vault (or AWS KMS) can be used to secure both the creation of the certificate and provide the ongoing storage and, importantly, access control and auditing. So from the point of creation of the App-only cert you should have an audit trail of who created it and who accessed it. The other advantage of having the cert in a Key Vault is that it also makes it easy to have a short expiry on the certificate and automate the renewal process, which in turn makes your auth process more secure.
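As a hedged sketch of that creation step using the Az PowerShell module (the vault and certificate names are illustrative; the portal works just as well):

# Create a short-lived self-signed certificate directly in the Key Vault
$policy = New-AzKeyVaultCertificatePolicy -SubjectName "CN=App1AuthCert" -IssuerName Self -ValidityInMonths 12
Add-AzKeyVaultCertificate -VaultName "gspskeys" -Name "App1AuthCert" -CertificatePolicy $policy

Done this way, the private key is generated inside the vault and never has to exist as a loose file on anyone's workstation.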

Once the authentication certificate is stored in the Key Vault, the weakest link can be the authentication method you then use to access the Key Vault. Eg a user account can be granted rights to the Key Vault (which then makes that user account the weakest link), or you could use another application secret or SSL certificate to secure access to the data plane of the Key Vault. At some point all of these become self-defeating from a security point of view, as you're still storing a credential (especially if you then store that as plain text), and for a persistent threat actor this still leaves you exposed.

One way of getting rid of the storage of credentials is the use of Managed Identities, where ("in my mind at least") you're trusting the infrastructure where the code is running. The simplest example would be: say you create an Azure compute function and give that function a Managed Identity; you can then grant that identity access to the Key Vault, and your function can now access the certificate from the Key Vault, authenticate to Azure, and access every Mailbox in your tenant. So now you have placed the trust in the code in the function and the underlying security of the function (eg can the code be exploited, could somebody hack the deployment method being used and replace the code with their own, etc). With a System-assigned managed identity in an Azure VM you're doing the same thing, but this time the point of trust is the Azure Virtual Machine. So now it's down to the physical security measures around the Azure VM, which becomes the weakest link. Still not infallible, but your options for securing a VM are many, so with good planning and practice you should be able to get a balance between flexibility and security.
So let's look at a simple implementation of using a System-assigned managed identity in an Azure VM in a PowerShell script to get an SSL certificate from a Key Vault and then access the Microsoft Graph using an App-only token generated with that certificate.

  1. The first thing you need is an Azure Key Vault where you have enabled auditing https://docs.microsoft.com/en-us/azure/key-vault/key-vault-logging
  2. You need to create an application registration in Azure AD for your app that will be using the SSL cert from the Key Vault to generate an App-only token, with Application permissions for the task you want it to perform. https://docs.microsoft.com/en-us/graph/auth-register-app-v2
  3. Make sure you then consent to the above application registration so it can be used in your tenant (the portal now makes this very straightforward)
  4. You need an Azure VM where you have enabled a System-assigned managed identity https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/qs-configure-portal-windows-vm
  5. Once your Azure VM has a System-assigned managed identity, you should grant that identity secret permissions so it can access the SSL certificate we are going to store in the Key Vault (see the sketch after this list)
  6. The next step is to create the self-signed certificate in the Azure Key Vault using the Azure Portal
  7. At this step you should now be able to access the self-signed certificate from the Key Vault on your VM with some simple PowerShell code. All you will need is the URL of the Key Vault Secret Identifier
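For step 5, the grant can be done in the portal, or, as a hedged sketch using the Az PowerShell module (the resource group, VM, and vault names are illustrative):

# Grant the VM's managed identity 'get' on secrets in the vault
$vmIdentity = (Get-AzVM -ResourceGroupName "MyRG" -Name "MyVM").Identity.PrincipalId
Set-AzKeyVaultAccessPolicy -VaultName "gspskeys" -ObjectId $vmIdentity -PermissionsToSecrets get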

For step 7, some code like the following will retrieve the certificate:

      $KeyVaultURL = "https://xxx.vault.azure.net/secrets/App1AuthCert/xxx99c5d054f43698f39c51f24440xxx?api-version=7.0"
      $SptokenResult = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -Headers @{Metadata="true"}
      $Sptoken = ConvertFrom-Json $SptokenResult.Content
      $headers = @{
'Content-Type' = 'application/json'
      'Authorization' = 'Bearer ' + $Sptoken.access_token
      }

      $Response = (Invoke-WebRequest -Uri $KeyVaultURL -Headers $headers)
      $certResponse = ConvertFrom-Json $Response.Content
      With the above example you’ll see the following hard-coded URI

      http://169.254.169.254/metadata/identity/oauth2/token

      This isn’t something you need to change as 169.254.169.254 is

      The endpoint is available at a well-known non-routable IP address (169.254.169.254) that can be accessed only from within the VM
      https://docs.microsoft.com/en-us/azure/virtual-machines/windows/instance-metadata-service

So the above script uses this local Metadata Service to acquire the access token used to access the Azure Key Vault (as the System-assigned managed identity). Once you have the certificate raw data from the Key Vault you can then load it into a typed certificate object, eg

      $base64Value = $certResponse.value
      $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
      $Certificate.Import([System.Convert]::FromBase64String($base64Value))
The last step is that the certificate must be uploaded to the application registration created in step 2, or added to the application manifest, either manually or programmatically. Eg the following is an example that produces the required manifest format described here https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-certificate-credentials


      $SptokenResult = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -Headers @{Metadata="true"}
      $Sptoken = ConvertFrom-Json $SptokenResult.Content
      $KeyVaultURL = "https://gspskeys.vault.azure.net/secrets/App1AuthCert/xxx99c5d054f43698f39c51f24440xxx?api-version=7.0"
      $headers = @{
'Content-Type' = 'application/json'
      'Authorization' = 'Bearer ' + $Sptoken.access_token
      }
      $Response = (Invoke-WebRequest -Uri $KeyVaultURL -Headers $headers)
      $certResponse = ConvertFrom-Json $Response.Content
      $base64Value = $certResponse.value
      $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
      $Certificate.Import([System.Convert]::FromBase64String($base64Value))
      $bin = $Certificate.GetCertHash()
      $base64Thumbprint = [System.Convert]::ToBase64String($bin)
      $keyid = [System.Guid]::NewGuid().ToString()
      $jsonObj = @{ customKeyIdentifier = $base64Thumbprint; keyId = $keyid; type = "AsymmetricX509Cert"; usage = "Verify"; value = $base64Value }
      $keyCredentials = ConvertTo-Json @($jsonObj) | Out-File "c:\temp\tmp.key"

This puts the certificate data into a file temporarily, which isn't great; you can actually use the Graph API to create an app registration and add the cert data directly, which means the cert data never needs to be exported/imported into a file.

Authenticating with the SSL Certificate you retrieved from the KeyVault

Once you have the certificate loaded you can then use the ADAL library to perform the authentication and get the App-only access token, which you can use in either the Microsoft Graph or EWS, eg

      $ClientId = "12d09d34-c3a3-49fc-bdf7-e059801801ae"
      $MailboxName = "gscales@datarumble.com"
      Import-Module .\Microsoft.IdentityModel.Clients.ActiveDirectory.dll -Force
      $TenantId = (Invoke-WebRequest -Uri ('https://login.windows.net/' + $MailboxName.Split('@')[1] + '/.well-known/openid-configuration') | ConvertFrom-Json).authorization_endpoint.Split('/')[3]

The ClientId is the one from the application registration in step 2. The following does the authentication using the configuration information from above and then makes a simple Graph request that uses the App-only access token that is returned.
      $Context = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.microsoftonline.com/" + $TenantId)
      $clientCredential = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.ClientAssertionCertificate($ClientId,$Certificate)
      $token = ($Context.AcquireTokenAsync("https://graph.microsoft.com", $clientCredential).Result)
      $Header = @{
'Content-Type' = 'application/json'
      'Authorization' = $token.CreateAuthorizationHeader()
      }
      $UserResult = (Invoke-RestMethod -Headers $Header -Uri ("https://graph.microsoft.com/v1.0/users?`$filter=mail eq '" + $MailboxName + "'&`$Select=displayName,businessPhones,mobilePhone,mail,jobTitle,companyName") -Method Get -ContentType "Application/json").value
      return $UserResult

      A Gist of the full script can be found here

      Using the MSAL (Microsoft Authentication Library) in EWS with Office365

Last July Microsoft announced here that they would be disabling basic authentication for EWS on October 13, 2020, which is now a little over a year away. Given the amount of time that has passed since the announcement, any line-of-business applications or third-party applications that you use that had been using Basic authentication should have been modified or upgraded to support oAuth. If this isn't the case, the time to take action is now.

When you need to migrate a .NET app or script that uses EWS and Basic Authentication, you have two authentication libraries to choose from:

      1. ADAL - Azure AD Authentication Library (uses the v1 Azure AD Endpoint)
      2. MSAL - Microsoft Authentication Library (uses the v2 Microsoft Identity Platform Endpoint)
The most common library you will come across in use is ADAL, because it's been around the longest, has good support across a number of languages, and allows complex authentication scenarios with support for SAML etc. MSAL is the latest and greatest in terms of its support for oAuth2 standards and is where Microsoft is investing its future development efforts. A good primer for understanding the difference in terms of the tokens both of these endpoints generate is https://docs.microsoft.com/en-us/azure/active-directory/develop/access-tokens

So which should you choose? If you're using PowerShell then ADAL is the easiest to use and there are a lot of good examples for it. However, from a long-term point of view the MSAL library can be a better choice, as it's going to offer more supportability (new features etc) going forward, as long as you don't fall into one of the restrictions described in https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/Adal-to-Msal

      In this post I'm going to look at using the MSAL library with EWS to access Mailboxes in Exchange Online. 

      Scopes

One of the biggest coding differences between the libraries is that with ADAL you specify the resource you're going to use, eg "https://outlook.office365.com", while with MSAL you specify the scopes you are going to use. With EWS it's relatively simple in that there are only two scopes (EWS doesn't allow you to constrain your access to different mailbox item types), which you would first need to allow in your application registration; they can be found in the Supported Legacy APIs section of the application registration (make sure you scroll right to the bottom).

      Delegated Permissions


Application Permissions (where you're going to use App-only tokens)

      .default Scope

For v1.0 apps you can get all the static scopes configured in an application using the .default scope, so for EWS that would look something like https://outlook.office365.com/.default . When you're using App-only tokens this becomes important.


      App registration and Consent

One of the advantages of the MSAL library is dynamic consent, which for EWS doesn't have much use because in practice you're only going to be using one scope. However, if you're also going to be using other workloads you may be able to take advantage of that feature. For the app registration you need to use the v2 endpoint registration process (which is the default now in the Azure portal), see https://docs.microsoft.com/en-us/graph/auth-register-app-v2. This also makes it easy to handle the consent within a tenant.

      Getting down to coding

In ADAL there was only one class, AuthenticationContext, which you used to request tokens. In MSAL you have PublicClientApplication (which you use for standard user authentication) and ConfidentialClientApplication, which gets used for App-only tokens and the On-Behalf-Of flow.

      Endpoints 

      With the v2 Endpoint you have the option of allowing
      1. common
      2. organizations
      3. consumers
      4. Tenant specific (Guid or Name)
For EWS you generally always want to use the tenant-specific endpoint, which means it's best to either dynamically get the TenantId for the tenant you're targeting or hard-code it. Eg you can get the TenantId you need with 3 lines of C#:


string domainName = "datarumble.com";
HttpClient Client = new HttpClient();
var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
    .Result.Content.ReadAsStringAsync().Result))
    .authorization_endpoint.ToString().Split('/')[3];
      In PowerShell you can do it with

      $TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]

      Delegate Authentication in EWS with MSAL and the EWS Managed API

This is generally the most common way of using EWS, where you're authenticating as a standard user and then accessing a Mailbox. If it's a shared Mailbox, access will need to have been granted via Add-MailboxFolderPermission, or you use EWS Impersonation.

This is the simplest C# example of an auth using the MSAL library in a console app to log on the currently logged-on user.

string MailboxName = "gscales@datarumble.com";
string scope = "https://outlook.office.com/EWS.AccessAsUser.All";
string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
string domainName = "datarumble.com";

HttpClient Client = new HttpClient();
var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
    .Result.Content.ReadAsStringAsync().Result))
    .authorization_endpoint.ToString().Split('/')[3];

PublicClientApplicationBuilder pcaConfig = PublicClientApplicationBuilder.Create("9d5d77a6-fe09-473e-8931-958f15f1a96b")
    .WithTenantId(TenantId);

pcaConfig.WithRedirectUri(redirectUri);
var TokenResult = pcaConfig.Build().AcquireTokenInteractive(new[] { scope })
    .WithPrompt(Prompt.Never)
    .WithLoginHint(MailboxName).ExecuteAsync().Result;

ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
service.HttpHeaders.Add("X-AnchorMailbox", MailboxName);

Folder Inbox = Folder.Bind(service, WellKnownFolderName.Inbox);


      AppOnly Tokens

This is where your application is authenticating using an app secret or SSL certificate; after this your app will get full access to all Mailboxes in a tenant (it's important to note that the scoping feature https://docs.microsoft.com/en-us/graph/auth-limit-mailbox-access doesn't work with EWS, so for that you need to be using the Graph or Outlook API).

string clientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b";
string clientSecret = "xxxx";
string mailboxName = "gscales@datarumble.com";
string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
string domainName = "datarumble.com";
string scope = "https://outlook.office365.com/.default";

HttpClient Client = new HttpClient();
var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
    .Result.Content.ReadAsStringAsync().Result))
    .authorization_endpoint.ToString().Split('/')[3];

IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(clientId)
    .WithClientSecret(clientSecret)
    .WithTenantId(TenantId)
    .WithRedirectUri(redirectUri)
    .Build();

var TokenResult = app.AcquireTokenForClient(new[] { scope }).ExecuteAsync().Result;
ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
service.HttpHeaders.Add("X-AnchorMailbox", mailboxName);
service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, mailboxName);
Folder Inbox = Folder.Bind(service, new FolderId(WellKnownFolderName.Inbox, mailboxName));


      Token Refresh

One of the big things missing in the EWS Managed API is a callback before each request that checks for an expired access token. Because tokens are only valid for one hour, if you have a long-running process like a migration/export or data analysis, you need to make sure you have some provision in your code to track the expiry of the access token and refresh the token when needed.
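As a hedged sketch of what that provision might look like in PowerShell (assuming $pca is the built PublicClientApplication that acquired the original token, and $Scopes/$service are as in the delegate examples below; this is one way to do it, not a library feature):

# Before each batch of EWS calls, renew the token silently if it's near expiry
if ($TokenResult.ExpiresOn -le [DateTimeOffset]::UtcNow.AddMinutes(5)) {
    $account = $pca.GetAccountsAsync().Result | Select-Object -First 1
    $TokenResult = $pca.AcquireTokenSilent($Scopes, $account).ExecuteAsync().Result
    $service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($TokenResult.AccessToken)
}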

      Doing this in PowerShell

If you're using PowerShell you can use the same code as above, as long as you import the MSAL library dll into your session.

      Some simple auth examples for this would be

       Delegate Authentication 


$MailboxName = "gscales@datarumble.com"
$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$Scope = "https://outlook.office.com/EWS.AccessAsUser.All"
$redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth"
$domainName = "datarumble.com"
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$pcaConfig = [Microsoft.Identity.Client.PublicClientApplicationBuilder]::Create($ClientId).WithTenantId($TenantId).WithRedirectUri($redirectUri)
$TokenResult = $pcaConfig.Build().AcquireTokenInteractive($Scopes).WithPrompt([Microsoft.Identity.Client.Prompt]::Never).WithLoginHint($MailboxName).ExecuteAsync().Result

      AppOnly Token


$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$MailboxName = "gscales@datarumble.com"
$RedirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth"
$ClientSecret = "xxx"
$Scope = "https://outlook.office365.com/.default"
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$app = [Microsoft.Identity.Client.ConfidentialClientApplicationBuilder]::Create($ClientId).WithClientSecret($ClientSecret).WithTenantId($TenantId).WithRedirectUri($RedirectUri).Build()
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TokenResult = $app.AcquireTokenForClient($Scopes).ExecuteAsync().Result





      Doing Mailbox Change discovery with an EWS PowerShell Script

Mailbox change discovery is the process of looking at any folders or items that are new or have been modified recently in a Mailbox. It's useful in a number of different ways, including (but not limited to):

• Looking at what objects a third-party Addin is creating or modifying in your mailbox
• Helping to work out which FAI (Folder Associated Item) is being modified when changes are made to the configuration in Outlook or Outlook on the Web (this can be useful if you then want to automate those changes in your own scripts)
• Fixing client issues caused by corrupt or bad items (eg if you've ever used MFCMapi to delete an item that's causing a particular client function not to work correctly)
• Getting an understanding of how the backend scaffolding of new features works in Outlook on the Web (eg looking at what the substrate is doing in Office365)
If you have ever looked recently at the Non_IPM root folder of any Office365 Mailbox, you can see from the large number of folders used by various apps, substrate processes, and new client features that there is a lot going on. So this script can also help give a bit of insight into what's happening in the background when you activate or use particular features (or potentially point you to the location in the Mailbox to look at when problems occur with certain features).
I'll go through a specific use case later looking at the contact favourite feature (which I struggled to even find the UI documentation for), which is what prompted me to write this script.

        What this script does

        The script has three main functions

1. Enumerates every folder in the Mailbox (both the IPM and NON_IPM subtrees) as well as Search Folders, and looks at the created and modified date of each folder. If they were created or modified within the lookback time, it adds them to the report
2. It then does a scan of the items in each folder (excluding Search Folders) and, if it finds any items that were modified or created after the lookback time, adds these to the report (see the sketch below)
3. It then does a scan of the FAI items (Folder Associated Items) in the folder and, again, if the items were modified or created after the lookback time, adds these to the report
The output of the report then contains information about which folders, items, and FAI items have been created or modified in the Mailbox in the last x number of seconds.
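As a rough illustration of the kind of query step 2 performs (a hedged sketch using the EWS Managed API, assuming $Folder is a bound folder; the real script does more than this):

# Find items whose last modified time falls inside the lookback window
$lookback = (Get-Date).AddSeconds(-60)
$sfModified = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsGreaterThan([Microsoft.Exchange.WebServices.Data.ItemSchema]::LastModifiedTime, $lookback)
$ivItemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)
$findResults = $Folder.FindItems($sfModified, $ivItemView)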

        An Example

The best way to demonstrate this is with an example, which was the reason I wrote the script. The Contact Favourite feature in Outlook on the Web gives you the ability to click the star next to a contact's name in OWA, which then creates a favourite shortcut, eg


So I wanted to know, when you did this, where the favourite item gets created, what information it stores, and what other changes happen. This is where the script comes in handy: all I needed to do was favourite a contact and then run the script immediately afterwards to look at the items that had changed in the Mailbox in the last 60 seconds. Eg a run of the script after I made the above change yielded a report that looked like



        So from the above report you can see that firstly a new Search Folder was created 
        \FavoritePersonas\Glen Scales_7d09f835-0028-4bd7-bed9-59535127bbe1NewSearchFolder



        under the \FavoritePersonas\ directory and also an object of type SDS.32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1.OutlookFavoriteItem was created under the folder

        \ApplicationDataRoot\32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1\outlookfavorites

The other thing that I included in the report was the EntryId of the item found, so in the above case I can take the EntryId of the outlookfavourite and open the item in a MAPI editor like OutlookSpy or MFCMAPI, eg


And you can then see all the MAPI properties on the item (or delete/export it etc)



That's it; it's relatively simple to use, eg

        Invoke-MailboxChangeDiscovery -MailboxName mailbox@domain -secondstolookback 60

        I've put this script up on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/ChangeDiscovery.ps1

        How to test SMTP using Opportunistic TLS with Powershell and grab the public certificate a SMTP server is using

Most email services these days employ opportunistic TLS when trying to send messages, which means that wherever possible the messages will be encrypted rather than sent in the plain text legacy of SMTP. This method was defined in RFC 3207 "SMTP Service Extension for Secure SMTP over Transport Layer Security", and there's quite a good explanation of opportunistic TLS on Wikipedia https://en.wikipedia.org/wiki/Opportunistic_TLS . This is used for both server to server (eg MTA to MTA) and client to server (eg a message client like Outlook, which acts as an MSA), the latter generally being authenticated.

Basically it allows you to have a normal plain text SMTP conversation that is then upgraded to TLS using the STARTTLS verb. Not all servers support this verb, so if it's not supported a message is just sent as plain text. TLS relies on PKI certificates and the administrative issues that come with certificate management, like expired certificates, which is why I wrote this script. Essentially I wanted to see the public certificate in use by the recipient SMTP server and couldn't find any easy-to-use method to get it. Eg in a web browser you can always view a certificate to check its authenticity, but with SMTP there aren't a lot of good tools around for this. You can use Telnet to test an SMTP server in plain text, but it's not easy to retrieve the TLS public certificate from the server for inspection over Telnet (or using something like putty etc).

In PowerShell this is pretty easy. Back in 2006 I wrote a plain text SMTP test script https://gsexdev.blogspot.com/2006/09/doing-smtp-telnet-test-with-powershell.html and a variant to do alerting on verbs https://gsexdev.blogspot.com/2007/06/testing-smtp-verbs-and-sending-alert.html, so this is more a modern version of those, with the addition of the System.Net.Security.SslStream class, which supports creating the TLS connection and also allows you to easily export the recipient server's public cert, eg

        Write-Host("STARTTLS")-ForegroundColorGreen
        $streamWriter.WriteLine("STARTTLS");
        $startTLSResponse=$streamReader.ReadLine();
        Write-Host($startTLSResponse)
        $ccCol=New-ObjectSystem.Security.Cryptography.X509Certificates.X509CertificateCollection
        $sslStream.AuthenticateAsClient($ServerName,$ccCol,[System.Security.Authentication.SslProtocols]::Tls12,$false);
        $Cert=$sslStream.RemoteCertificate.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert);
        [System.IO.File]::WriteAllBytes($CertificateFilePath,$Cert);
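For context, the snippet above assumes the plain text conversation has already been set up; a hedged sketch of that setup (variable names mirror the snippet, while the port and EHLO handling are illustrative):

# Open the plain text connection and read the banner before STARTTLS
$socket = New-Object System.Net.Sockets.TcpClient($ServerName, 587)
$netStream = $socket.GetStream()
$streamReader = New-Object System.IO.StreamReader($netStream)
$streamWriter = New-Object System.IO.StreamWriter($netStream)
$streamWriter.AutoFlush = $true
Write-Host $streamReader.ReadLine()   # 220 banner
$streamWriter.WriteLine("EHLO " + $SendingDomain)
while (($line = $streamReader.ReadLine()) -match '^250-') { Write-Host $line }
Write-Host $line   # final 250 line of the EHLO response
$sslStream = New-Object System.Net.Security.SslStream($netStream)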

I've created a small PowerShell script module that has two cmdlets. The first is called Get-SMTPTLSCert, which can be used to get the public cert being used by the SMTP endpoint; eg for Gmail you could use Get-SMTPTLSCert -ServerName smtp.gmail.com -Sendingdomain youdomain.com -CertificateFilePath c:\temp\gmailpubCer.cer. By default this uses the client submission port 587 SMTP-MSA (port 25 is often blocked from most locations), so it's testing client (Message Submission Agent) to server, rather than server to server between two SMTP Message Transfer Agents. The sending domain is required because most SMTP servers don't allow an empty helo/ehlo statement. I've also included a cmdlet Invoke-TestSMTPTLS that does a test of the SMTP server and does authentication if necessary; eg usually to use port 587 you need to authenticate to the SMTP server. So in this instance, once you have used the STARTTLS verb to upgrade the plain text conversation to TLS, you can then use the AUTH LOGIN verb to submit the username and password as base64 strings to authenticate. Eg this is what the auth looks like in the script:
        $command="AUTH LOGIN"
        write-host-foregroundcolorDarkGreen$command
        $SSLstreamWriter.WriteLine($command)
        $AuthLoginResponse=$SSLstreamReader.ReadLine()
        write-host($AuthLoginResponse)
        $Bytes=[System.Text.Encoding]::ASCII.GetBytes($Credentials.UserName)
        $Base64UserName=[Convert]::ToBase64String($Bytes)
        $SSLstreamWriter.WriteLine($Base64UserName)
        $UserNameResponse=$SSLstreamReader.ReadLine()
        write-host($UserNameResponse)
        $Bytes=[System.Text.Encoding]::ASCII.GetBytes($Credentials.GetNetworkCredential().password.ToString())
        $Base64Password=[Convert]::ToBase64String($Bytes)
        $SSLstreamWriter.WriteLine($Base64Password)
        $PassWordResponse=$SSLstreamReader.ReadLine()
        write-host$PassWordResponse

So the above code takes a PSCredential object and turns it into the necessary SMTP verbs to authenticate. Run against Office365 this looks like:


(The base64 values above decode to Username: and Password: )

This cmdlet doesn't actually send a message; it just invokes the envelope verbs, and no DATA verb is sent (which is where the MIME message would go). I've put a copy of the script on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/TLS-SMTPMod.ps1


        Creating a year at a glance Calendar (in Excel) from aggregated Shared Calendars in Exchange Online within a Microsoft Teams Tab app using the Microsoft Graph

Calendaring formats in messaging clients tend to all follow much the same approach, whether it's Outlook, Outlook on the Web or mobile, Gmail, or Microsoft Teams. Like email data, calendar data can be random and complex in its volume and type (recurring appointments etc), so a simple year-at-a-glance calendar is hard for someone designing a mass-market client to do well for all the data types and volumes that could be encountered. Therefore it's not something you see in mail clients by default (let's face it, who wants to support that).

In the absence of year-at-a-glance calendars, I was surprised to see people using Excel to create yearly aggregated calendars in Microsoft Teams for events (for data that already existed in shared calendars). More surprising is that it actually worked kind of well when there wasn't a lot of data to be shown. The one thing that sprang to my mind was that if you could automate this, it would be really good for people who use the Birthday calendar feature in Outlook, simple company events calendars, and also public holiday calendars, especially when you want to aggregate multiple countries' public holidays in a simple spreadsheet to help people like me who work across multiple regions, and then share that within a team.

So I set out to build a simple Microsoft Teams Tab application that could create an aggregated spreadsheet of events from any calendar (or calendars) in an Office365 Mailbox that was shared to a particular Microsoft Teams (Group), using the Graph API to get the calendar data from the Mailboxes and also to build the Excel workbook via the workbook functionality the Graph has. The result is then stored in a OneDrive file and presented back to the user in an iFrame as an embedded Excel Online spreadsheet. The end result looks something like this (this is the result of a Shared Mailbox with the holiday calendars added/imported for Australia, the US, and the UK, and that Mailbox being shared to the Group/Teams):


        How it works

Like the other Teams Tab apps I've written, it takes advantage of the Teams tab silent auth method documented here. Once the code has acquired an access token for the Graph it can get to work.

        Configuration 

For this application to work I needed to be able to store the configuration of the calendars I wanted to aggregate. As the app is written in JS, the easiest form of config file was a straight JSON file like the following:
{
    "Calendars": [
        {
            "CalendarEmailAddress": "mb1@datarumble.com",
            "CalendarName": "Australia holidays",
            "CalendarDisplayName": "Australia"
        },
        {
            "CalendarEmailAddress": "mb1@datarumble.com",
            "CalendarName": "United States holidays",
            "CalendarDisplayName": "United States"
        }
    ]
}


Then I just required a way of storing and retrieving the file (a todo would be to create a nice form to allow people to create and edit the config, but if I had time ...). The Teams client SDK (and tab apps) don't have any provision for storing custom configuration, properties, or pretty much anything configuration related, so I just went with putting the file in the Channel document library as a starting point. Next I just needed some Graph code to grab the contents of that file; in JS the easiest way I found to do this was as follows.

From the Teams Context interface you can get the GroupId and ChannelName where your tab is executing, so you can then construct the following URL to use in a GET against the MS Graph:

        v1.0/groups/" + GroupId + "/drive/root:/" + channelName + "/ExcelCalendarConfig.json

The Graph documentation points to using the /content endpoint to download the contents of a file. I have used this before in .NET (and node.js) and it works okay; it returns a 302 response with a Location header that can be followed to the SharePoint site. In client-side JS it's a lot messier, so I found it easier to do this:

        CCDriveItem = await GenericGraphGet(Token,CalendarConfigURL);
        var CCFetch = await fetch(CCDriveItem["@microsoft.graph.downloadUrl"]);

The @microsoft.graph.downloadUrl is a short-lived URL for the file that doesn't need authentication, so it's easy to just do a GET and then fetch that URL to return the JSON back to the code, and I don't have to wade through a bunch of URL-follow and CORS issues with ajax and fetch.

        Template

One of the things the Graph API can't do is create a new Excel file from scratch, so you either have to have an existing file to create a session against, or use one of the various libraries people recommend to create the file. An easy solution for me was to create a blank Excel file with no metadata and include it with the web files, so I could just copy it to OneDrive as a template file (overwriting any existing older file that might be there) and then use that.

        Storing the Result File 

One other problem for this project was where to store the end result file. At first I just used the SharePoint library associated with the Teams Channel, but there were problems with the file becoming locked easily if two people ran it simultaneously. I also wanted to be able to run this with the least amount of permissions possible, so the user's App Folder (for this Tab app) seemed like the best starting point, which is what the following code handles.


let AppDrive = await GenericGraphGet(Token, "https://graph.microsoft.com/v1.0/me/drive/special/approot");
let FileData = await ReadTemplate();
var fileName = "Calendars.xlsx";
var UploadURL = "https://graph.microsoft.com/v1.0/me/drive/special/approot:/" + fileName + ":/content";
let NewFile = await CreateOneDriveFile(Token, UploadURL, FileData);

        Getting the Calendars

Getting the calendars was probably the easiest task: from the config file, the CalendarName property is used to find the folder in the Mailbox you want the data from. The query of the calendar is then done for the current year's data using a CalendarView (which will expand any recurring calendar appointments). To aggregate the retrieved calendar data into orderable lists I used multiple Map objects in JS, loop iterations, and arrays, so I get an ordered list of events aggregated first by the month and then by the day within the month.

        Building the Spreadsheet 

To build the spreadsheet in the output format I wanted (which mirrored what I saw users doing manually), I had to first insert the data, then merge the month rows so I only had one row per month, then format the merge so the text was aligned correctly and had the correct formatting, and lastly autofit the columns so the spreadsheet displayed correctly to users. This required a lot of separate requests to the Graph API, which at first ran a little slowly. Then came batching.

        Batching

Batching really is a godsend when it comes to performance with a task like this. For example, my original code made around 40-50 individual requests to get the data and formatting done, and with batching it was reduced to around 6 (and I was being a little conservative and could have reduced this further). The big tip for using batching with the workbook endpoint is that you need to make sure you include the workbook session id with every request (just not on the batch request itself). If you don't, you will get a lot of EditModeCannotAcquireLockTooManyRequests errors, which the documentation, the error (and the internet in general) aren't really helpful in explaining.
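As a hedged sketch of the shape of such a batch (shown in PowerShell for brevity; $itemId, the range address, and $sessionId are illustrative placeholders; note the workbook-session-id header on the inner request but not on the batch call itself):

# One JSON batch wrapping a workbook merge call
$batchBody = @{
    requests = @(
        @{
            id      = "1"
            method  = "POST"
            url     = "/me/drive/items/$itemId/workbook/worksheets('Sheet1')/range(address='A2:A13')/merge"
            headers = @{ "workbook-session-id" = $sessionId; "Content-Type" = "application/json" }
            body    = @{ across = $false }
        }
    )
}
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/`$batch" -Method Post -Headers @{ Authorization = "Bearer $Token" } -ContentType "application/json" -Body (ConvertTo-Json $batchBody -Depth 6)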

        Displaying it back to the Teams tab

This turned out to be one of the hardest problems to solve and is one of the outstanding issues with this in Teams anyway. I used an iFrame and generated an embed link (which is what you get when you use Share - Embed in Excel Online). This does work okay in the browser as long as you already have a login to your personal OneDrive (token in the cache), otherwise you will be prompted to log on to SharePoint. In the desktop client this logon is a problem, so instead of opening within the tab, if the code detects the desktop client it launches a new browser tab (which you may or may not need to log on to SharePoint to view). This was a little disappointing, but probably something I'll have a fix for soon (if anybody has any suggestions I'm all ears).

        GitHub Repo for this App

I have a hosted version of this Tab app on my GitHub pages at https://gscales.github.io/TeamsExcelCalendar/ and there is a repo version inside the aggregation engine repo https://github.com/gscales/ExcelCalendarAggregate/tree/master/TeamsExcelCalendar with a Readme that details the installation process.

        Building on the Aggregation engine

Because I kind of enjoy taking things and running with them, I have some plans for using the calendar-to-Excel aggregation engine in a few different formats. The first will be a simple PowerShell script so you can do the same thing from an automation context, so if you're interested in this but don't want a Teams tab app, watch this space.


           

Update to ExchangeContacts Module to support Modern Auth, exporting all Contacts to a VCF file (or CSV), NON_IPM root folder, hidden contact folders and dumpster exports

        I've done some updating of my ExchangeContacts PowerShell module to support the following

1. Modern Authentication in Office365 (distributing the ADAL dll with this module)
2. Compiled and distributed the latest version of the EWS Managed API from GitHub with the module
3. New cmdlet Export-EXCContacts that supports exporting all contacts in a folder to a single VCF file
4. The same cmdlet Export-EXCContacts also supports exporting all contacts to a CSV file (this was already possible with the ExportFolder cmdlet, but this is a slightly enhanced format)
5. New cmdlet Export-EXCRootContacts lets you export the Non_IPM Subtree folders that contain contacts (some of these are created by the Office365 substrate process), for example mycontacts, AllContacts, ContactSearch folders etc. Includes dedup code based on email address
6. This was already supported, but I wanted to show how you can export the hidden contact folders like Recipient Cache, GAL and Organizational Contacts
7. New cmdlet Get-EXCDumpsterContacts gets the contacts that are in the RecoverableItems Deletions or Purges folder
8. New cmdlet Export-EXCDumpsterContacts exports the contacts that are in the RecoverableItems Deletions or Purges folder to a single VCF or CSV file

        Using Modern Authentication

As Basic Authentication in EWS is going away soon in Office365, I've enabled Modern Auth for this module using the ADAL dll, which gets distributed via the bin directory in the module. I didn't enable it by default because it would cause issues with OnPrem Exchange, so to use Modern Auth you just need to use the -ModernAuth switch. You can still pass in a PSCredential object with the -ModernAuth switch, and oAuth will then be used via the username and password grant to allow for silent auth. There is also provision to pass in your own client id for custom app registrations with the -ClientId parameter. Eg a simple example of using ModernAuth is:



Get-EXCContact -MailboxName gscales@datarumble.com -EmailAddress what@what.com -ModernAuth
        Export-EXCContacts

Export-EXCContacts supports exporting all the contacts from any folder in a Mailbox to a single VCF file or a CSV file. (EWS provides the VCF natively for Mailbox contacts, so this cmdlet handles streaming them out to a single file.) Here are some examples:

        Exporting to a single VCF

Export-EXCContacts -Folder "\Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.vcf
        or to a CSV


Export-EXCContacts -Folder "\Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.csv -ExportAsCSV
        Export-EXCRootContacts

Export-EXCRootContacts supports exporting contacts from the NON_IPM_Subtree folders in a Mailbox. Typically, folders here are created by either a client like Outlook or OWA, other Office365 substrate processes (eg Microsoft Teams), or other third-party apps that want the data hidden from the user. Examples of these folders would be AllContacts, mycontacts etc. I've added this more for educational and diagnostic purposes and included some dedup code to deduplicate exports based on the EmailAddress. An example of exporting the AllContacts folder to a CSV file:


Export-EXCRootContacts -MailboxName gscales@datarumble.com -FolderName AllContacts -FileName c:\temp\allContacts.csv -ExportAsCSV -ModernAuth

        Get-EXCDumpsterContacts


This cmdlet will query either the RecoverableItemsDeletions or RecoverableItemsPurges folder in a Mailbox (Dumpster v2 folders), get any contacts that exist in these folders, and return them as EWS Contact objects. (You can then process them further, eg copy, move etc.)

        eg


Get-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ModernAuth -Purges
or deletions (the default)




        Get-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ModernAuth 
        Export-EXCDumpsterContacts

This cmdlet builds on Get-EXCDumpsterContacts and allows you to export what is returned to either a single VCF file or a CSV file (same logic as Export-EXCContacts). Eg for deletions:



Export-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ExportAsCSV -FileName c:\temp\dumpsterDeletions.csv
        or purges


Export-EXCDumpsterContacts -MailboxName gscales@datarumble.com -purges -ExportAsCSV -FileName c:\temp\dumpsterPurges.csv


Exporting hidden Contacts folders

One last thing I wanted to demonstrate with this module is the ability to export the hidden contact folders in your mailbox. If you have ever peeked at the Contacts folder subfolder hierarchy in a MAPI editor like MFCmapi, there are a number of hidden folders, eg


Folders like Recipient Cache, GAL Contacts, and Organizational Contacts all serve different client-specific tasks (that do go wrong sometimes). So you can use this module to export the contacts in these folders to a CSV for any troubleshooting, migration, or personal interest needs.

        Here are some examples of exporting contacts from those folders to a csv file


Export-EXCContacts -Folder "\Contacts\Organizational Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.csv -ExportAsCSV

        The new module can be found on the Powershell Gallery https://www.powershellgallery.com/packages/ExchangeContacts/1.6.0.0 and the source is available here on GitHub https://github.com/gscales/Powershell-Scripts/tree/master/EWSContacts/Module

Using Azure device code authentication on an Arduino IoT 33 and getting the Teams presence from the Microsoft Graph

A while ago I published this post on accessing the Graph directly from an Arduino, which made use of the "resource owner password credentials grant" (meaning it used a hard-coded username and password). Once you have enabled MFA (multi-factor authentication) on an account, this grant no longer works because you have no ability to provide the other factors for the authentication to succeed. For devices like Arduinos, or most IOT devices that have very limited UI capabilities, this is where device code authentication can be used.

The way device code authentication works is that instead of posting the user credentials to the token endpoint to get an access token, you first make a post to the /v2.0/devicecode endpoint, which gives you a specific user code to authenticate with on another device. You then visit http://microsoft.com/devicelogin (on a PC or mobile device), enter the user code, and authenticate as the required user, completing any extra MFA authentication. In the meantime, the limited-UI device polls the token endpoint; once authentication has been completed (on the external device), instead of the endpoint returning a pending error, the poll result will be a normal access token (and refresh token) that can then be used to access any Graph resources you have access to.
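As a hedged sketch of those two legs (shown here in PowerShell rather than the Arduino C code; the tenant, client id, and scope values are illustrative placeholders):

# Leg 1: ask the tenant's devicecode endpoint for a user code to show the user
$tenant   = "yourtenant.onmicrosoft.com"
$clientId = "00000000-0000-0000-0000-000000000000"
$dc = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenant/oauth2/v2.0/devicecode" -Body @{ client_id = $clientId; scope = "https://graph.microsoft.com/Presence.Read offline_access" }
Write-Host $dc.message   # tells the user to visit microsoft.com/devicelogin and enter the code

# Leg 2: poll the token endpoint until the user finishes signing in elsewhere
# (see the note further down about the parameter name in the docs at the time)
$token = $null
do {
    Start-Sleep -Seconds $dc.interval
    try {
        $token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenant/oauth2/v2.0/token" -Body @{ grant_type = "urn:ietf:params:oauth:grant-type:device_code"; client_id = $clientId; device_code = $dc.device_code }
    } catch { }   # authorization_pending until the user completes sign-in
} while (-not $token)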

        Visually on the Serial port here is what the whole process looks like on the Arduino

The last part of this code makes a request to get the presence from Microsoft Teams, which was introduced into beta in the Microsoft Graph in December; see https://docs.microsoft.com/en-us/graph/api/resources/presence?view=graph-rest-beta.

So putting this all together you can make a simple Teams presence light with a circuit like this (circuit is for demonstration purposes only)


and then process the presence result returned from the Graph using the code I've referenced below.

A few notes on device code authentication: it's important when you set up your app registration in the Azure Portal that you mark your registration as public, "Treat application as a public client", eg



Device code requests must be made against the tenant endpoint (you can't use the common endpoint). In the code I've included discovery code that gets the tenant-specific endpoint to use, based on the domain name stored in the Secrets file.

Also, if you're reading this because you're following the documentation for device code auth at https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-oauth2-device-code and you can't get it to work, there is an issue with the payload information in the document. Where device_code is used as a parameter name in the payload in the documentation, it should just be code, with your device code as the value.

I've put the sketch that contains the code I've used for device code authentication and grabbing the presence from the Microsoft Graph on GitHub here https://github.com/gscales/MS-Graph-Arduino/tree/master/MSGraph-Presence. Please refer to my previous article for details on getting your code up and running on an Arduino IoT 33, which includes downloading the SSL certs to the device, which is required (also flash the firmware).

A couple of notes on the code: because the Json parsing library I used can't handle the access token response, I needed to manually parse the token out (which is a little frustrating), but that is one of the challenges of working with Arduinos and dealing with the issues that limited memory causes.

        Export calendar Items to a CSV file using Microsoft Graph and Powershell

For the last couple of years the most consistently popular post by number of views on this blog has been Export calendar Items to a CSV file using EWS and Powershell, closely followed by the contact export scripts. It goes to show this is just a perennial issue that exists around mail servers; I think the first VBS script I wrote to do this type of thing was in the late 90's against Exchange 5.5 using CDO 1.2.

Now it's 2020, and if you're running Office365 you should really be using the Microsoft Graph API to do this. So what I've done is create a PowerShell module (and a one-file script version for those who are more comfortable with that format) that's a port of the popular EWS script above. This script uses the ADAL library for Modern Authentication (which, if you grab the module from the PowerShell gallery, will come down with it). Most EWS properties map one-to-one with the Graph, and the Graph actually provides better information on recurrences than EWS did. Where extended properties were used in the EWS script, the equivalents are used in the Graph. (The only real difference is the AppointmentState property, which is a strongly typed property in EWS, but I had to use the extended property in the Graph.)
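Under the covers the export is built on the Graph calendarView (which, unlike a plain events query, expands recurring appointments). As a hedged sketch rather than the module's exact code, assuming $token holds an access token acquired as described below:

# Pull the last 7 days of appointments and write a minimal CSV
$MailboxName = "gscales@datarumble.com"
$start = (Get-Date).AddDays(-7).ToString("yyyy-MM-ddTHH:mm:ss")
$end   = (Get-Date).ToString("yyyy-MM-ddTHH:mm:ss")
$uri = "https://graph.microsoft.com/v1.0/users/$MailboxName/calendar/calendarView?startDateTime=$start&endDateTime=$end"
$events = (Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }).value
$events | Select-Object subject, @{n='Start';e={$_.start.dateTime}}, @{n='End';e={$_.end.dateTime}} | Export-Csv c:\temp\Last7.csv -NoTypeInformation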

Just a couple of things you need to know if you're new to Microsoft Graph scripts and Modern Authentication:

1. You need an approved Azure application registration to use this (or any script that is going to access the Graph). The Microsoft walk-throughs https://docs.microsoft.com/en-us/graph/auth-register-app-v2 are pretty good at describing how to do this. Specific config I recommend you use:

        "https://login.microsoftonline.com/common/oauth2/nativeclient" as the redirectURL (this is part of the Suggested Redirect URIs for public clients (mobile, desktop)).

2. Permissions for the above



You only need the following permissions for this script to work: Calendars.Read gives you rights to the calendar of the account being used, and Calendars.Read.Shared gives you read access to any calendars that account has been granted access to (eg via delegation, the admin portal, or Add-MailboxPermission).

Then you just need to copy the Application (client) ID guid from the overview screen in the application registration and use that in the -ClientId parameter of the Export-GCECalendarToCSV cmdlet.

I've included a demo multi-tenant app registration as the default in the module that has just these rights, which you can use for testing, but I would always recommend you create your own.

You can install the module, which will give you access to the Export-GCECalendarToCSV and Export-GCECalendar cmdlets, from the PowerShell gallery https://www.powershellgallery.com/packages/MSGraph-ExportCalendar/ (see the instructions on that page).

Or, if you want to take the script and modify it yourself, it's located on GitHub https://github.com/gscales/Powershell-Scripts/blob/master/MSGraph-ExportCalendar/functions/Export-GCECalendarToCSV.ps1

A simple example of exporting the last 7 days of calendar appointments to CSV:

        Export-GCECalendarToCSV -MailboxName gscales@datarumble.com -StartTime (Get-Date).AddDays(-7) -EndTime (Get-Date) -FileName c:\temp\Last7.csv



        Automating opening a Search-Mailbox result in Excel using EWS

While the Search-Mailbox cmdlet is now deprecated in Exchange Online, OnPrem it's still used a fair bit, and it does still have some use in the cloud for specific tasks. I've been using it a fair bit this week for various testing tasks, and one pain I found when doing a lot of repeated searches in logging mode is that each time you have to go in, open the results message in the discovery search mailbox, download the attachment with the log file, unzip it, and open it in Excel. So I came up with a way of automating this in PowerShell, which turned out to be pretty simple but effective.

First off, the only information you need to locate the results message gets returned in the TargetFolder property of the search results, e.g.


The TargetFolder value tells you which folder in the Discovery Search Mailbox the results are stored in, and the DateTime value that will be in the subject of the results message.

So in EWS you can use FindFolder to find that folder (using a Split on "\", which will work as long as you don't put that character in the displayName) and then FindItem can be used to find the results item, e.g.

$ivItemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(1)
$SfSearchFilter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+ContainsSubstring([Microsoft.Exchange.WebServices.Data.ItemSchema]::Subject, $Subject)
$findItemResults = $Folder.FindItems($SfSearchFilter, $ivItemView)
if ($findItemResults.Items.Count -eq 1) {
    return $findItemResults.Items[0]
}
else {
    throw "No Item found"
}
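The FindFolder half isn't shown above; a minimal sketch of walking the TargetFolder path in the Discovery Search Mailbox might look like this (assuming $service is an already connected ExchangeService and $SearchResultPath holds the TargetFolder value; names are illustrative):

$fldArray = $SearchResultPath.Split("\")
# Bind to the root of the Discovery Search Mailbox and walk down one level at a time
$folderId = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot, $MailboxName)
$Folder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, $folderId)
for ($i = 1; $i -lt $fldArray.Length; $i++) {
    # Shallow FindFolder on the DisplayName of each path segment
    $fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1)
    $SfFolderFilter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::DisplayName, $fldArray[$i])
    $findFolderResults = $service.FindFolders($Folder.Id, $SfFolderFilter, $fvFolderView)
    if ($findFolderResults.Folders.Count -eq 1) {
        $Folder = $findFolderResults.Folders[0]
    }
    else {
        throw "Folder Not Found"
    }
}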

        Once you have the Results Message you can download the Attachment using some code like this

if ($SearchResultItem.HasAttachments) {
    $SearchResultItem.Load()
    foreach ($Attachment in $SearchResultItem.Attachments) {
        $Attachment.Load()
        # save the attachment content (next snippet)
    }
}

        I then save it to the default downloads directory using


        $downloadDirectory = (New-Object -ComObject Shell.Application).NameSpace('shell:Downloads').Self.Path
        $fileName = ($downloadDirectory + "\" + $ItemPath.SubString(1).Replace("/", "-").Replace(":", "-") + "-" + $Attachment.Name.ToString())
        $fiFile = new-object System.IO.FileStream($fileName, [System.IO.FileMode]::Create)
        $fiFile.Write($Attachment.Content, 0, $Attachment.Content.Length)
        $fiFile.Close()

and finally open the zip file, extract the CSV data stream from the archive, save that as a file in the Downloads directory and then open that file in Excel.


Add-Type -AssemblyName System.IO.Compression.FileSystem # needed in Windows PowerShell to load ZipFile
if ($FileName.contains(".zip")) {
    $Zip = [System.IO.Compression.ZipFile]::OpenRead($FileName)
    try {
        foreach ($file in $Zip.Entries) {
            if ($file.Name.contains(".csv")) {
                $ms = New-Object System.IO.MemoryStream
                $ZipStream = $file.Open()
                $ZipStream.CopyTo($ms)
                $outputfile = $FileName.replace("zip", "")
                [System.IO.File]::WriteAllBytes($outputfile, $ms.ToArray())
                Invoke-Item $outputfile
            }
        }
    }
    catch {
        Write-Host $_.ScriptStackTrace
    }
    $Zip.Dispose()
}

        An example of using this is

        $SearchResult = Search-Mailbox -id meganb -TargetFolder Search1 -SearchQuery from:glen -TargetMailbox "DiscoverySearchMailbox{D919BA05-46A6-415f-80AD-7E09334BB852}@xxx.onmicrosoft.com" -LogOnly -LogLevel Full

        then

        Get-SearchMailboxResultsToExcel -MailboxName "DiscoverySearchMailbox{D919BA05-46A6-415f-80AD-7E09334BB852}@M365x680608.onmicrosoft.com" -SearchResultPath $SearchResult.TargetFolder -Verbose

or if you want to use ModernAuth (you will need the ADAL dll in the same directory)

        Get-SearchMailboxResultsToExcel -MailboxName "DiscoverySearchMailbox{D919BA05-46A6-415f-80AD-7E09334BB852}@M365x680608.onmicrosoft.com" -ModernAuth -SearchResultPath $SearchResult.TargetFolder -Verbose

        I've put a download of this script on GitHub https://github.com/gscales/Powershell-Scripts/blob/master/Get-SearchMailboxResultsToExcel.ps1


        Migrating your Mailbox searches in EWS to the Graph API Part 1 Filters and Search Folders

        $
        0
        0
This is part one of a two-part post where I'm going to look at how you can migrate any searches you are doing in EWS to the Graph API. In this first part I'm going to cover SearchFilters (from EWS) and search folders, as they have been around the longest, and in part 2 I'll look at searches, which have some new functionality in beta in the Graph.

Let's start by looking at how you might be doing searches in EWS at the moment:

        • Search Filters (restrictions) in a FindItem Request that can be run against a Folder or Search Folder
        • QueryString (KQL) in a FindItem Request that can be run against a Folder or Search Folder
        • SearchFolder with a FindItem Request
• eDiscovery via SearchMailbox, which has now been deprecated in Office 365 and is no longer supported
        Search Filters (Restrictions)

If you have used the EWS Managed API to build your application you use the SearchFilter class, which creates an underlying restriction in EWS https://docs.microsoft.com/en-us/exchange/client-developer/web-service-reference/restriction. The term restriction came from the Exchange ROPs protocol, which is what MAPI uses to talk to the Exchange Store.

In the Microsoft Graph the language you talk when it comes to filtering is OData:
OData (Open Data Protocol) is an ISO/IEC approved, OASIS standard that defines a set of best practices for building and consuming RESTful APIs (ref https://www.odata.org/)
OData filters are therefore a standard that anybody implementing the protocol (as Microsoft has done with the Graph API) should adhere to.

Let's look at some examples (from this blog) of SearchFilters I've used and how they can be converted to Graph OData filters.

        Easy - the easiest query to make is against one of the strongly typed properties like Subject or Sender eg in EWS you might have a search filter like this

        SearchFilter SubjectFilter = new SearchFilter.IsEqualTo(ItemSchema.Subject,"Subject");

        in the Graph this would just look something like this (applied just against the Inbox)

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?$filter=subject eq 'test'

This is an equality search; some other things you can do are

        Startswith (which you couldn't actually do in EWS)

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?$filter=startswith(Subject,'test')

        Sub-String Searches

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?$filter=Contains(Subject,'test')

With the latter two searches, if you have a folder with a large item count then these aren't going to perform like an equality or a content-index search would, and it's possible they will time out the first time you run the query. (Subsequent queries may then succeed; this is due to the way Exchange applies temporary restrictions for dynamic searches.)
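If you want to try these filters from PowerShell rather than the Graph Explorer, a minimal sketch looks like this (assuming $AccessToken already holds a delegated Graph token with at least Mail.Read):

$headers = @{ 'Authorization' = "Bearer $AccessToken" }
# Note the backtick so PowerShell doesn't treat $filter as a variable
$uri = "https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?`$filter=subject eq 'test'"
$result = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
# Matching messages come back in the value collection (paged via @odata.nextLink)
$result.value | Select-Object subject, receivedDateTime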

Medium - the most used search filter in EWS for me is the date restriction, where you want to restrict the emails returned to a certain time frame, so an EWS search filter like the following

$Sfgt = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsGreaterThan([Microsoft.Exchange.WebServices.Data.ItemSchema]::DateTimeReceived, $Startdatetime)
$Sflt = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsLessThan([Microsoft.Exchange.WebServices.Data.ItemSchema]::DateTimeReceived, $Enddatetime)
$sfCollection = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+SearchFilterCollection([Microsoft.Exchange.WebServices.Data.LogicalOperator]::And)
$sfCollection.add($Sfgt)
$sfCollection.add($Sflt)


        In the Graph this would look like

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $filter=(receivedDateTime gt 2020-03-24T13:01:50Z) AND (receivedDateTime lt 2020-03-25T12:59:50Z)


        (Just watch the date formatting)

Hard - if you're using extended properties in your SearchFilters then you need to include the extended property definition in the filter and use the lambda any or all expression.

        The first Query looks for Items based on the underlying Message Class (which isn't exposed as a strongly typed property in the Graph)

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $filter=singleValueExtendedProperties/any(ep:ep/id eq 'String 0x001a' and ep/value eq 'IPM.Note')


Another useful filter is being able to find messages where a particular property has been set, e.g. this filter looks for messages that have the In-Reply-To header set (so responses only)

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $filter=singleValueExtendedProperties/any(ep: ep/id eq 'String 0x1042' and ep/value ne null)


For non-string properties, for instance the message size, you need to make sure you cast the value to the JSON datatype. E.g. to find all messages larger than 1 MB (1048576 bytes) you could use

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $filter=singleValueExtendedProperties/any(ep:ep/id eq 'Integer 0x0E08' and cast(ep/value, Edm.Int32) gt 1048576)


In EWS when you did a search that returned a large number of items and was paged, you also got back the total number of items that matched your search query. I've used this in the past to get statistical information about email with a filter, without needing to page through all the results. The Graph offers the same thing using the $count query parameter; e.g. if I change the above query to find all messages above 1 MB in my mailbox and include the $count parameter, this is what I get for my mailbox



Another example of this may be to look at the count of messages received in 2018

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $count=true&$filter=(receivedDateTime gt 2018-01-01) AND (receivedDateTime lt 2019-01-01)


And finally one last example is for PidTagHasAttachments, because I know I and others use this (it gives a different value than the strongly typed hasAttachments for various reasons)

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $filter=singleValueExtendedProperties/any(ep:ep/id eq 'Boolean 0x0E1B' and cast(ep/value, Edm.Boolean) eq true)

Hopefully I've covered off enough examples here for anybody stuck with syntax to be able to get their head around it. If you do have problems, try posting a question on Stack Overflow.

        SearchFolders

Search folders give you a way of creating a virtual folder that represents an underlying restriction (or search) that can span one folder or the whole mailbox. While they are more suited to static type searches, if you have ever used the MAPI Fiddler inspector https://github.com/OfficeDev/Office-Inspectors-for-Fiddler to look at what Outlook is doing under the covers when you do a search, you can see that Outlook uses search folders to provide a more functional search for dynamic queries.

Another example used in the Microsoft Graph is the me/messages endpoint, which is a search folder that provides access to all the mail folders in a mailbox.

In EWS when you create a SearchFolder you specify a SearchFilter for that folder to be based on. With the Graph you can similarly create a search folder based on an OData filter, which I've detailed above. So, looking at something topical, if you wanted to create a search folder to show all the email with a subject containing Coronavirus you could use

        {
          "@odata.type": "microsoft.graph.mailSearchFolder",
          "displayName": "Coronavirus Email",
          "includeNestedFolders": true,
          "sourceFolderIds": ["AQMkADYA…."],
          "filterQuery": "contains(subject, 'Coronavirus')"
        }
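To actually create the search folder you POST that JSON to the childFolders endpoint of the folder you want it to live under; a sketch (assuming a token with Mail.ReadWrite and that the JSON above is in $body):

$headers = @{ 'Authorization' = "Bearer $AccessToken"; 'Content-Type' = 'application/json' }
# searchfolders is the well-known folder name for the Search Folders root
$uri = "https://graph.microsoft.com/v1.0/me/mailFolders('searchfolders')/childFolders"
$newSearchFolder = Invoke-RestMethod -Uri $uri -Headers $headers -Method Post -Body $body
# The returned id can then be used like any other folder id to enumerate the matching messages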

One thing to be aware of that is mentioned in the search folder documentation for the Graph https://docs.microsoft.com/en-us/graph/api/resources/mailsearchfolder?view=graph-rest-1.0 is

        1. Search folders expire after 45 days of no usage.
        2. There are limits on the number of search folders that can be created per source folder. When this limit is breached, older search folders are deleted to make way for new ones.
So if you are going to use search folders in your application you will need to make sure you have some appropriate management logic. Search folders are pretty powerful, but like EWS, the Graph only implements a subset of what can be done in MAPI, so if you are trying to reproduce what you see is possible in Outlook you may not be able to do it with the Graph (or EWS).

        Migrating your Mailbox searches in EWS to the Graph API Part 2 KQL and new search endpoints

This is part 2 of my blog post on migrating EWS search to the Graph API; in this part I'm going to be looking at using KQL searches and the new Microsoft Search API (currently in beta). The big advantage these types of searches have over using SearchFilters is that they use the content indexes, which can improve the performance of searches when folder item counts get high. They also allow you to query the contents of attachments, which are indexed through IFilters on the server.

        KQL queries on the Mailbox and Mailbox Folders

In EWS you have been able to use firstly AQS and now KQL in the FindItems operation from Exchange 2013 up. Migrating these searches to the Microsoft Graph is pretty simple, e.g. an EWS FindItem query to search for all messages with a PDF attachment

FindItemsResults<Item> fiItems = service.FindItems(QueryFolder, "Attachmentnames:.pdf", iv);

        in the Graph you would use something like

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages
        ?$search="attachmentnames:.pdf"


The slightly disappointing thing with the Graph is that you can't use $count along with a search, which, when you're doing statistical type queries (e.g. say I wanted to know how many emails received in 2019 had a PDF attachment), makes this very painful to do in the Graph, where in EWS it can be done with one call (it's a real snowball that one).

Searching the recipient fields like To and CC: in the forums you see some absolute clangers of search filters that try to search the recipients and from fields of messages, which can easily be done using the participants keyword, which covers all the people fields in an email message (From, To and Cc). The one thing to be aware of is the note on expansion in https://docs.microsoft.com/en-us/microsoft-365/compliance/keyword-queries-and-search-conditions?view=o365-worldwide, so if you don't want expansion to happen you need to ensure you use the wildcard character after the participant you're searching for. A simple participants query looks like

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $search="participants:Fred"


        Date range queries

One of the good things about KQL with dates is that you can use reserved keywords like today, yesterday and this week, e.g.

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')
        /messages?$search="received:yesterday"


To get all the email received between two dates you can use either

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $search="received:2019-01-01...2019-02-01"


        or

        https://graph.microsoft.com/v1.0/me/mailFolders('Inbox')/messages?
        $search="(received>=2019-01-01 AND received<=2019-02-01)"


If you want to search the whole of the mailbox using the Graph, e.g. if you have used the AllItems search folder in EWS to do a search that spans all the mail folders in a mailbox, you just need to use the /messages endpoint, e.g.

        https://graph.microsoft.com/v1.0/me/messages?
        $search="(received>=2019-01-01 AND received<=2019-02-01)"



        New Search Methods

The traditional search methods in EWS give you the normal narrow-refiner search outputs that most mail apps have been providing over the past 10-20 years. While these methods have improved over the years, there haven't been any great leaps in functionality with search. So the Microsoft Graph has been adding some newer endpoints that allow a more modern approach to searching. The first is Microsoft Graph data connect, which has been around for a while now, and the second is the Microsoft Search API, which is still in beta. As this article is about migrating EWS searches you probably wouldn't consider either of these for your traditional search migration, as $filter and $search are going to meet those needs. However, if you are looking at overhauling the search functionality in your application, or you are building greenfield functionality, then both of these new methods are worth considering.



Graph data connect is your go-to endpoint when you want to do any mass processing of mailbox data. It solves the problem of having to crawl every item in a mailbox when you want to do any data-mining type operations by basically providing an Azure dataset of this information for you. Data connect is great; however, it has a high entry level. First you need a Workplace Analytics licence for every mailbox you wish to analyse, and the costs can mount pretty quickly the larger the mailbox count you're dealing with. The other requirement is paying for the underlying Azure storage etc. that your dataset ends up consuming. I think it is a bit of a shame that the licencing costs can lock a lot of developers out of using this feature, as it really does provide a great way of working with mail item data, and it leaves some having to resort to doing their own crawling of mailbox data to avoid these costs (e.g. that licencing cost is a pretty hard sell for any startup looking to use this).


        Microsoft Search API

        https://docs.microsoft.com/en-us/graph/search-concept-overview

This is the newest way of searching mailbox data. While the underlying mechanism for doing mailbox searches is still KQL, so it's very similar to the $search method described above, this API does enhance the search results with some more "search intelligence", like relevance, bringing AI into the picture. One of the other main benefits of this endpoint is when you want to broaden your search to other Office365 workloads or even include your own custom data searches. So this really is the endpoint that will provide you with a modern search experience/workflow, which is getting more critical due to the sheer amount of data we have (e.g. the datageddon). It's still in beta and is a little restricted at the moment, e.g.

• It can't be used to search delegate mailboxes, only the primary mailbox
• It only returns the pageCount for items, not the total number of items found in a search (to be fair, $search does this as well, which is really annoying)
• Searches are scoped across the entire mailbox
• Just Messages and Events are searchable at the moment






        Graph Mailbox Basics with PowerShell Part 1 Folders

I haven't done a basics series for a while, but based on some of the questions I've been getting lately, and the lack of good mailbox-specific examples for basic (but more complex) tasks using the Graph against Exchange Online mailboxes, this seemed like a good series to write.

For all the scripts in this series I'm not going to use any modules or other libraries, so everything will be done using Invoke-WebRequest and Invoke-RestMethod. While there is nothing wrong with using libraries or modules, and there are a number of advantages in doing so, it just keeps the examples as simple and easy to understand as they can be.

Authentication: you can't have an article on the Graph without talking about authentication, and we are now far from the past where all you needed was a simple username and password and you were off to the races. The basics are that first you will need an Azure app registration (that has been consented to); there are many pages dedicated to how you can do this (this is one of the better ones) so I'm not going to dwell too much on it. My simple template script has a function called Get-AccessTokenForGraph which takes a ClientId and RedirectURI and does an interactive login to get the Azure access token. With oAuth there are many other ways of authenticating, so if this doesn't fit your needs you just need to plug your own code into the Get-AccessTokenForGraph function.

        Get-FolderFromPath

With Exchange, the locator (think of a file path as an analogy) you use to access a folder programmatically is its FolderId. Every Exchange API has its own interpretation of the FolderId, starting with the Fid and PidTagEntryId in MAPI; EWS has the EWSId and the Graph just has the Id (and the EMS gives a combination of Ids back depending on which cmdlet you use). The Graph and EWS ids contain the PidTagEntryId with a bunch of other flags that tell the service how to locate and open the folder. However, most of the time us humans think of folders in terms of paths, e.g. if I have a subfolder of the Inbox a more human reference would be \Inbox\subfolder (language differences aside). So one of the more common methods I use is Get-FolderFromPath, to get a folder (or just the FolderId) so you can then work on the items within that folder or the folder itself. The method I've always used in EWS is to take the path you want to search for, split it on the \ character, and then do a number of shallow searches of the parent folders until you find the child folder you want. In the Graph this looks something like this:

                $RequestURL = $EndPoint + "('$MailboxName')/MailFolders('MsgFolderRoot')/childfolders?"
                $fldArray = $FolderPath.Split("\")
                $PropList = @()
                $FolderSizeProp = Get-TaggedProperty -DataType "Long" -Id "0x66b3"
                $EntryId = Get-TaggedProperty -DataType "Binary" -Id "0xfff"
                $PropList += $FolderSizeProp 
                $PropList += $EntryId
                $Props = Get-ExtendedPropList -PropertyList $PropList 
                $RequestURL += "`$expand=SingleValueExtendedProperties(`$filter=" + $Props + ")"
                #Loop through the Split Array and do a Search for each level of folder 
                for ($lint = 1; $lint -lt $fldArray.Length; $lint++) {
                    #Perform search based on the displayname of each folder level
                    $FolderName = $fldArray[$lint];
                    $headers = @{
                        'Authorization' = "Bearer $AccessToken"
                        'AnchorMailbox' = "$MailboxName"
                    }
            $RequestURL += "`&`$filter=DisplayName eq '$FolderName'"
                    $tfTargetFolder = (Invoke-RestMethod -Method Get -Uri $RequestURL -UserAgent "GraphBasicsPs101" -Headers $headers).value  
                    if ($tfTargetFolder.displayname -match $FolderName) {
                        $folderId = $tfTargetFolder.Id.ToString()
                        $RequestURL = $EndPoint + "('$MailboxName')/MailFolders('$folderId')/childfolders?"
                        $RequestURL += "`$expand=SingleValueExtendedProperties(`$filter=" + $Props + ")"
                    }
                    else {
                        throw ("Folder Not found")
                    }
                }
So for each folder step I'm finding the intermediate folder using $filter=DisplayName eq '$FolderName'.
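The Get-TaggedProperty and Get-ExtendedPropList helpers used above come from the full script; minimal sketches of what they do would look something like this (illustrative, not the exact module code):

function Get-TaggedProperty {
    param([String]$DataType, [String]$Id)
    # Describes a MAPI tagged property in the syntax Graph extended-property filters expect
    return [PSCustomObject]@{ DataType = $DataType; Id = $Id }
}

function Get-ExtendedPropList {
    param([PSCustomObject[]]$PropertyList)
    # Builds the (Id eq 'Long 0x66b3') or (Id eq 'Binary 0xfff') filter fragment
    $props = @()
    foreach ($prop in $PropertyList) {
        $props += ("(Id eq '" + $prop.DataType + " " + $prop.Id + "')")
    }
    return ($props -join " or ")
}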

        To make the results more useful I've included a few extended properties that give me some extra information

The first is the folder size, which in MAPI is the PidTagMessageSizeExtended property on the folder.

The second is the PidTagEntryId (PR_ENTRYID) property, which I added so I could easily convert it into the folderid format used in Office365 compliance search, e.g. when you do a compliance search you have the ability to use the folderid:xxxx keyword to limit the search of a mailbox to one particular folder. There is a script in https://docs.microsoft.com/en-us/microsoft-365/compliance/use-content-search-for-targeted-collections?view=o365-worldwide which uses the Get-MailboxFolderStatistics cmdlet, which I found a little cumbersome, so having a simple method like the above can return the id I need for the folder I want. E.g. this is what the end result looks like when you run the script



        The REST request that is generated by the script looks like (if you want to try this in the graph explorer)

        https://graph.microsoft.com/v1.0/users('gscales@datarumble.com')
        /MailFolders('MsgFolderRoot')/childfolders?
        $expand=SingleValueExtendedProperties($filter=(Id%20eq%20'Long%200x66b3')
        %20or%20(Id%20eq%20'Binary%200xfff'))
        &$filter=DisplayName%20eq%20'inbox'
There are a bunch more things you can do with this type of query, e.g. working with the retention tags on a folder, or using the FolderId to then process the items within that folder. The reason I started with this function is that for me it's always a jumping-off point for starting to work with mailbox data.

        Modifying your EWS Managed API code to use Hybrid Modern Authentication against OnPrem Mailboxes

In this post I'm going to look at what you need to do in your EWS Managed API code to support using Hybrid Modern Authentication where previously you've been using Basic or Integrated Authentication (both of which are susceptible to password spray attacks). If you don't know what Hybrid Modern Authentication is, put simply it brings the security benefits of the Modern Authentication that Azure AD offers Office365 tenants to Exchange OnPrem email clients. If you're already using OAuth to connect to Office365 you have most of the work already done, but you will still need logic to ensure you have the correct audience set in your token when that code is used against an OnPrem mailbox.

        Prerequisites 

        You need to be using Hybrid Exchange or more specifically 

        Hybrid Office 365 tenant is configured in full hybrid configuration using Exchange Classic Hybrid Topology mode ref https://docs.microsoft.com/en-us/exchange/clients/outlook-for-ios-and-android/use-hybrid-modern-auth?view=exchserver-2019 

If you don't want to enable Hybrid Modern Authentication but still want to use oAuth in EWS, you can; there is a good article by Ingo on how to do this: https://practical365.com/exchange-server/configure-hybrid-modern-authentication-for-exchange-server/


        Authentication - Acquiring the Token

This is where you need to make the most changes in your current code, as you will now need some logic to acquire the oAuth tokens from Azure AD. The easiest way of doing this is to use one of the authentication libraries from Microsoft: either ADAL (if you already have this implemented in your code) or, preferably, the MSAL library. The difference between ADAL and MSAL is that ADAL uses the v1 Azure oAuth endpoint and MSAL uses the v2; there is a good description of the differences between the two endpoints at https://nicolgit.github.io/AzureAD-Endopoint-V1-vs-V2-comparison/

Getting the intended audience value for your token request

The audience of an oAuth token is the intended recipient of the token (or basically the resource it's going to be used against); in our Exchange EWS context this is the host-name part of the EWS external endpoint. In Office365 the EWS endpoint will be https://outlook.office365.com/ews/exchange.asmx, so the intended audience of a token will be https://outlook.office365.com. If you look at an Office365 token in jwt.io this is what you see, e.g.



When you're using Hybrid Modern Authentication the audience value for your token will become the external EWS endpoint's host-name of your OnPrem server (generally what you have configured in Get-WebServicesVirtualDirectory).

In the authentication libraries this audience is passed differently:

In ADAL (v1 Azure endpoint) it's passed as the resource URL

string ResourceURL = "https://outlook.office365.com";
var AuthResults = AuthContext.AcquireTokenAsync(ResourceURL,
    "xxxxx-52b3-4102-aeff-aad2292ab01c", new Uri("urn:ietf:wg:oauth:2.0:oob"),
    new PlatformParameters(PromptBehavior.Always)).Result;

In MSAL (v2 Azure endpoint) it's passed as part of the scope

string scope = "https://outlook.office365.com/EWS.AccessAsUser.All";
PublicClientApplicationBuilder pcaConfig =
    PublicClientApplicationBuilder.Create(ClientId).WithAuthority(AadAuthorityAudience.AzureAdMultipleOrgs);
var IntToken = pcaConfig.Build().AcquireTokenInteractive(
    new[] { scope }).ExecuteAsync().Result;
With Hybrid Modern Auth, in the above examples outlook.office365.com would be replaced with the host name of your external EWS endpoint, which you would usually obtain via Autodiscover.

        AutoDiscover 

Autodiscover has been in Exchange since Exchange 2007 to help you discover the internal or external endpoint you need for whatever API you're using. It is, however, an authenticated endpoint, so when you remove Basic/Integrated Authentication your code needs to be able to deal with this change. There are two ways you could go about this: the first is to generate an OAuth token first and use that to make the authenticated traditional Autodiscover request; the second is to use Autodiscover v2 (or Autodiscover JSON), which allows you to make an unauthenticated Autodiscover request to return the API endpoint you want.

If your code targets predominantly Office365 and Hybrid tenants then just switch to Autodiscover v2; if you have a mix of Office365, Hybrid and OnPrem islands then you still need the legacy Basic/Integrated Auth method for those OnPrem clients. The approach I'm taking in this post is to first do a realm discovery against Office365 to determine if a particular set of credentials is an Office365 or Hybrid account. If it is, a v2 Autodiscover request is made against Office365, and if not, fail back to the legacy code. This isn't 100% guaranteed to work for some OnPrem (especially pre Exchange 2016) and account combinations, so my advice is to always make sure your try/catch Autodiscover logic includes at least one legacy Autodiscover attempt as a last fail-back, and make sure you do some regression testing of your code change against Exchange 2013.

A simple JSON-based Autodiscover against Office365 can be done in a few lines with HttpClient in C#:

string MailboxName = "gscales@datarumble.com";
string EWSEndPoint = $"https://outlook.office365.com/autodiscover/autodiscover.json/v1.0/{MailboxName}?Protocol=EWS";
HttpClient httpClient = new HttpClient();
httpClient.DefaultRequestHeaders.UserAgent.ParseAdd("Mozilla/5.0 (compatible; AcmeInc/1.0)");
dynamic JsonResult = JsonConvert.DeserializeObject(httpClient.GetAsync(EWSEndPoint).Result.Content.ReadAsStringAsync().Result);
Console.WriteLine(JsonResult.Url);
Or in PowerShell you could do it as a one-liner:

(Invoke-WebRequest -Uri "https://outlook.office365.com/autodiscover/autodiscover.json/v1.0/gscales@datarumble.com?Protocol=EWS" | ConvertFrom-Json).url
When you submit an Autodiscover request against Office365, if your mailbox is OnPrem and you have HMA configured, you will get back your OnPrem EWS endpoint.

        EWS Managed API 

So what does this look like in the context of your EWS Managed API code? Let's first look at the traditional code path for Autodiscover:

ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2013);
service.Credentials = new WebCredentials("user1@contoso.com", "password");
service.AutodiscoverUrl("user1@contoso.com", RedirectionUrlValidationCallback);
Here is what it would look like using a realm discovery and then a JSON Autodiscover, with some MSAL code to do the token acquisition, otherwise following the above logic. A condensed PowerShell sketch of that flow follows (assuming the MSAL.PS module and the EWS Managed API assembly are available; a library-free version would swap Get-MsalToken for your own Invoke-WebRequest token code).
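Import-Module MSAL.PS
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

$MailboxName = "user@yourdomain.com"
$ClientId = "your-app-registration-id"

# 1. Realm discovery - is this a Managed/Federated (Office365 or Hybrid) account?
$realm = Invoke-RestMethod -Uri "https://login.microsoftonline.com/common/UserRealm/$($MailboxName)?api-version=1.0"
if ($realm.account_type -eq "Managed" -or $realm.account_type -eq "Federated") {
    # 2. Unauthenticated JSON (v2) Autodiscover against Office365
    $ewsUrl = (Invoke-RestMethod -Uri "https://outlook.office365.com/autodiscover/autodiscover.json/v1.0/$($MailboxName)?Protocol=EWS").Url
}
else {
    throw "Fall back to legacy Autodiscover here"
}

# 3. Build the scope from the discovered EWS endpoint's host name (the token audience)
$audience = "https://" + ([Uri]$ewsUrl).Host
$token = Get-MsalToken -ClientId $ClientId -Scopes "$audience/EWS.AccessAsUser.All" -Interactive

# 4. Plug the token into the EWS Managed API
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2013)
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($token.AccessToken)
$service.Url = New-Object Uri($ewsUrl)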


        Dealing with Token Refresh (Important)

Access tokens in Azure are valid for 1 hour by default, so if your application is going to run for a long period of time, or is persistent, then you will need to manage token expiration and refresh. If you're using one of the authentication libraries they can perform this for you automatically; however, they rely on you calling their methods before you make any authenticated EWS call. Currently the EWS Managed API doesn't offer a callback to help you integrate easily with an authentication library (for doing the token refresh management), so you will need to come up with your own method of doing this (e.g. a simple method to check the token before any operation). Or you can modify the EWS Managed API source to integrate your own callback; a good place to look is PrepareWebRequest in https://github.com/OfficeDev/ews-managed-api/blob/70bde052e5f84b6fee3a678d2db5335dc2d72fc3/Credentials/OAuthCredentials.cs. The good thing about modifying the source is that you fix the issue for any operation that your code will do now and into the future.
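For the "check before any operation" approach, a minimal sketch using the MSAL.PS token cache (an assumption; any library with a silent-acquire call works the same way, and the cache must have been populated by an earlier interactive call):

function Ensure-EWSToken {
    param($Service, $ClientId, $MailboxName, $Scope)
    # Silent acquisition hits the MSAL cache and transparently refreshes the
    # access token when it is close to expiry
    $token = Get-MsalToken -ClientId $ClientId -Scopes $Scope -LoginHint $MailboxName -Silent
    $Service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($token.AccessToken)
}

# Call it before every Bind/FindItems etc.
Ensure-EWSToken -Service $service -ClientId $ClientId -MailboxName $MailboxName -Scope "https://outlook.office365.com/EWS.AccessAsUser.All"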


        Modifying your EWS WSDL Proxy Code for Modern Authentication

This is a follow-on from my last post, Modifying your EWS Managed API code to use Hybrid Modern Authentication against OnPrem Mailboxes. If instead of the EWS Managed API you are using EWS proxy code (generated from the EWS WSDL) and you want to migrate it to using Modern Authentication for Office365 and/or Hybrid, here's a method you can use with the MSAL authentication library.

Unlike the EWS Managed API, the WSDL-generated proxy classes, and specifically the ExchangeServiceBinding class, don't have any provision to use token credentials. One way of implementing this in .NET is to take advantage of polymorphism: create a new class derived from the ExchangeServiceBinding class and override the GetWebRequest method (which actually comes from the SoapHttpClientProtocol base class, https://docs.microsoft.com/en-us/dotnet/api/system.web.services.protocols.soaphttpclientprotocol.getwebrequest?view=netframework-4.8).

At the same time we can also add the X-AnchorMailbox header into the request, which is recommended for any Exchange Online requests you make. And because this method is called before every EWS request, we can place our token refresh code in there. In this example, which uses MSAL, all you need to include is code that fetches the token from the token cache; this will trigger a token refresh if needed, or ultimately throw to interaction if the refresh token isn't available. So here is a basic C# console app that can do Hybrid/Modern Auth discovery using the MSAL library. If you want the project files you can download them from here



         

        Using 2 Authentication factors (for MFA) in an unattended PowerShell Script

MFA (Multi-Factor Authentication) is great at making the authentication process more secure in Exchange Online but can be challenging in automation scenarios. I originally wrote this code for something I wanted to run unattended on a Raspberry Pi that was running PowerShell, where I wanted to use MFA but avoid going down the path of the 90-day RefreshToken/device code method, and where I also didn't want to use app authentication via certificates or client secrets.

Interestingly, while I was writing this post Microsoft announced certificate-based Modern Auth in Exchange Online PowerShell https://techcommunity.microsoft.com/t5/exchange-team-blog/modern-auth-and-unattended-scripts-in-exchange-online-powershell/ba-p/1497387. That article also links to the Secure App Model https://docs.microsoft.com/en-us/powershell/partnercenter/multi-factor-auth?view=partnercenterps-3.0#exchange, which is the way Microsoft recommends you handle MFA in unattended delegate scenarios, so this post is an alternative if you can't go down that path.

One thing to note about any unattended method is that by its nature (because it isn't interactive) it can potentially be unraveled by a bad actor if they make it past the outer security of the machine where the script is running, e.g. an unlocked machine or weak logon security that would allow an attacker to access the stored RefreshToken/certificate/shared key (no matter where they are stored or how they are encrypted). However, the more factors you can put into your unattended process, e.g. encrypting the stored cert/key etc., the slower it's going to be for anybody trying to unravel it. The gold standard would be a secure Azure VM where the secrets themselves are in a Key Vault accessed via a managed identity, but in the real world this is not always possible. In security terms you should always consider what the implications are when the security around your unattended process fails (not if it fails).

        Factors

The first of the two factors in my script is a username and password, which is really the only primary authentication method that currently lends itself to being used unattended (FIDO and OATH require some level of interactivity). The second factor I'm using is TOTP (Time-based One-Time Passwords), which are relatively easy to generate as they are based on https://tools.ietf.org/html/rfc6238. Other people have already created a few PowerShell functions for generating TOTPs; the one I decided to use was https://www.powershellgallery.com/packages/SecurityFever/2.2.0/Content/SecurityFever.psm1, which just requires that you pass in the shared secret you get when you add a (third-party) authenticator to your account in Office365 as an additional authentication method. To do this you should select to add an authenticator application as an additional authentication method and select "Configure app without notifications" from the screen that shows the QR code. You should then be presented with a secret (shared) key (if it has spaces in it you will need to remove them), which you feed into the Get-TimeBasedOneTimePassword cmdlet to provide the verification code needed to successfully add the authenticator, e.g.

        Get-TimeBasedOneTimePassword -SharedSecret txxxxxxxxxxxx

You then need to store this shared secret as securely as you can, as you will use it each time you want to log on to generate the TOTP factor for the authentication. A few ways of storing it: firstly, use an Azure Key Vault; or, if you're running it on a Windows machine, consider the credential store, which can be used easily via another PowerShell module https://www.powershellgallery.com/packages/CredentialManager/2.0 (there's a simple demo of using it at https://github.com/pnp/PnP-PowerShell/wiki/How-to-use-the-Windows-Credential-Manager-to-ease-authentication-with-PnP-PowerShell). I would suggest you put a second layer of encryption on top of this, so whatever goes into the credential store is also encrypted.
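A sketch of the credential-store approach using that CredentialManager module (the Target name is illustrative, and your extra encryption layer is left out for brevity):

Import-Module CredentialManager
# One time: store the shared secret (spaces removed) in the Windows Credential Store
New-StoredCredential -Target "O365TOTPSecret" -UserName "user@yourdomain.com" -Password "txxxxxxxxxxxx" -Persist LocalMachine

# At run time: pull it back out and generate the current TOTP
$stored = Get-StoredCredential -Target "O365TOTPSecret"
$totp = Get-TimeBasedOneTimePassword -SharedSecret $stored.GetNetworkCredential().Password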

        MFA Code

So far I've been using a whole bunch of other people's code, but now comes the bit I wrote, which does the MFA authentication using OpenID Connect https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-protocols-oidc and then gets an access token once an authorization_code is acquired. This code is relatively straightforward: it starts the normal browser-based authentication flow, maintaining the session using Invoke-WebRequest's -WebSession parameter, and parses out the context, flow and canary tokens from the response, which are needed in various parts of the process. For the additional authentication factor the TOTP is used.

        Example 

While my code was written primarily for EWS running on a Raspberry Pi, a more relevant sample is an Exchange Online PowerShell connection, e.g.

So this simple example uses the CredentialManager PS module to grab a shared secret from the credential store and generate a TOTP, and then you have a normal set of PSCredentials; the "BasicAuthToOAuthConversion=true" query-string does the rest.

A few things to remember: access tokens are only good for 1 hour, so if you expect your script to run for longer than this you will need to put in a method of refreshing the token (or just generate a new access token).

        I've put the code for the AzureMFAOTPv2 module on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/AzureMFAOTPv2.ps1

         






        Modifying your Exchange Online PowerShell Managed Code to use oAuth and MSAL

While not as popular these days, many .NET developers may in the past have used managed code to run Exchange Online PowerShell cmdlets to do things like assign mailbox permissions, or run other EXO PowerShell cmdlets to get reporting information where no other alternatives were available (or are still available). The majority of these code bases are most likely using basic authentication, using something like


In this post I'm going to cover how to change your existing code; you might want to consider, however, making use of some of the new ExchangeV2 PowerShell module functionality to improve performance and security. But migrating existing code from Basic Authentication to oAuth is relatively straightforward:
1. You will need some code to do the authentication; for this I'm going to use the MSAL library, because it's both the recommended library from Microsoft and easy to use.
2. You should create your own Azure app registration (and consent to it) that has the Exchange.Manage permission, e.g.


        (If you can't create your own app registration you can use the well-known ClientId from the V2 PowerShell Module which I've used in the below samples).

Once you have your authentication code generating a token, you then use that as the password in the PSCredential object you pass in the WSManConnectionInfo object. The one thing you need to change is the WSManConnection URI: include the DelegatedOrg parameter, which should be set to your domain, and add BasicAuthToOAuthConversion=true, so your connection string should look like
https://outlook.office365.com/powershell-liveid?DelegatedOrg=yourdomain.onmicrosoft.com&BasicAuthToOAuthConversion=true
E.g. an interactive auth sample to run Get-Mailbox would look like the sketch below.
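The original sample is C# managed code; a rough PowerShell rendering of the same flow (assuming the MSAL.PS module; note the token goes in as the password with a "Bearer " prefix):

Import-Module MSAL.PS
# Well-known ClientId of the Exchange Online (V2) PowerShell module
$token = Get-MsalToken -ClientId "a0c73c16-a7e3-4564-9a95-2bdf47383716" -Scopes "https://outlook.office365.com/.default" -Interactive
$secPwd = ConvertTo-SecureString ("Bearer " + $token.AccessToken) -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential("user@yourdomain.onmicrosoft.com", $secPwd)
$uri = "https://outlook.office365.com/powershell-liveid?DelegatedOrg=yourdomain.onmicrosoft.com&BasicAuthToOAuthConversion=true"
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri $uri -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session -CommandName Get-Mailbox
Get-Mailbox -ResultSize 10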

If you need your code to run non-interactively with a set of credentials you can use the ROPC grant instead (with MSAL that just means swapping the interactive token call for a username/password based one).

         



        Graph Basics Get the User Photo and save it to a file (and resize it) with PowerShell

This is part 2 of my Graph basics series, and this post is born out of an actual need I had over the last week, which was to get a user photo from the Microsoft Graph and save it in a custom size and a different image type. Like many things there are multiple ways of doing this, but the Microsoft Graph photo endpoint is pretty straightforward and delivers the image in one of the following sizes: 48x48, 64x64, 96x96, 120x120, 240x240, 360x360, 432x432, 504x504 and 648x648. Because I wanted to use the photo on an Elgato Stream Deck, I needed it to be 72x72, so I needed some extra code to resize the photo and change the format from JPEG to PNG.

        Getting the user-photo from the Microsoft Graph 

        Before you can get the user's photo from Microsoft Graph you need to make sure the application registration you are using has one of the following permissions

        User.Read, User.ReadBasic.All, User.Read.All, User.ReadWrite, User.ReadWrite.All

Then, after you have obtained the token, make a request to the Graph like

        https://graph.microsoft.com/v1.0/users('user@domain.com')/photos/240x240/$value 

The result of this GET request will be a 240x240 JPEG image for that user; if you use Invoke-WebRequest you can then simply use the -OutFile parameter to specify the output filename and that's it.

        My function to download and resize the user photo looks like
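The full function is embedded in the original post; a minimal sketch of the same idea follows (an -AccessToken parameter is added here because the token-acquisition helper isn't shown, and System.Drawing is used for the resize):

function Get-GraphUserPhoto {
    param(
        [String]$MailboxName,
        [String]$AccessToken,
        [String]$FileName,
        [String]$PhotoSize = "240x240",
        [Int32]$ReSizeDimension,
        [String]$ReSizeImageFormat
    )
    $headers = @{ 'Authorization' = "Bearer $AccessToken" }
    $uri = "https://graph.microsoft.com/v1.0/users('$MailboxName')/photos/$PhotoSize/`$value"
    Invoke-WebRequest -Uri $uri -Headers $headers -OutFile $FileName
    if ($ReSizeDimension) {
        Add-Type -AssemblyName System.Drawing
        $source = [System.Drawing.Image]::FromFile($FileName)
        # Draw the downloaded photo into a new square bitmap of the requested size
        $resized = New-Object System.Drawing.Bitmap($source, $ReSizeDimension, $ReSizeDimension)
        $source.Dispose()
        $resized.Save($FileName, [System.Drawing.Imaging.ImageFormat]::$ReSizeImageFormat)
        $resized.Dispose()
    }
}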


        So if you just want the user photo in the native format (which will be jpeg) use

        Get-GraphUserPhoto -MailboxName user@domain.com -Filename c:\temp\dnlPhoto.jpeg -PhotoSize 240x240

        If you want to get the user photo and resize it (72x72) and save it as a png

Get-GraphUserPhoto -MailboxName user@domain.com -Filename c:\temp\dnlPhoto.png -PhotoSize 240x240 -ReSizeDimension 72 -ReSizeImageFormat Png

        Testing and Sending email via SMTP using Opportunistic TLS and oAuth in Office365 with PowerShell

As well as EWS and Remote PowerShell (RPS), the other mail protocols POP3, IMAP and SMTP have had OAuth authentication enabled in Exchange Online (official announcement here). A while ago I created a script that used opportunistic TLS to perform a Telnet-style test against an SMTP server using SMTP AUTH. Now that oAuth authentication has been enabled in Office365 I've updated this script to be able to use oAuth instead of SMTP AUTH to test against Office365. I've also included a function to actually send a message.

        Token Acquisition 

To send a mail using oAuth you first need to get an access token from Azure AD, and there are plenty of ways of doing this in PowerShell. You could use a library like MSAL or ADAL (just Google your favoured method) or use the library-less approach I've included with this script. Whichever way you do it, you need to make sure that your application registration https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app has the permission scope https://outlook.office.com/SMTP.Send. One thing to note is that application permissions aren't supported at the moment, so you need to use one of the delegate authentication flows (which means having a service account for send-as-other-user scenarios).


        Adding the SASL XOAUTH2 header

This was really the only thing I needed to change in the initial script, apart from adding in the code to get the OAuth token. The SASL header looks like the following:

        base64("user=" + userName + "^Aauth=Bearer " + accessToken + "^A^A")

The ^A character is a control code which relates to character 01 in the ASCII character set, corresponding to SOH (Start of Heading).
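Building that SASL string in PowerShell is only a couple of lines ($UserName and $AccessToken are assumed to already be populated):

# ^A is ASCII 0x01 (SOH)
$ctrlA = [char]0x01
$saslXoauth2 = "user=" + $UserName + $ctrlA + "auth=Bearer " + $AccessToken + $ctrlA + $ctrlA
# This base64 string is what gets sent on the line after AUTH XOAUTH2
$saslBase64 = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($saslXoauth2))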

        Testing out OAuth


        To test out oAuth against Office365 use something like the following

        Invoke-TestSMTPTLSwithOauth -ServerName smtp.office365.com -SendingAddress gscales@datarumble.com -To gscales@datarumble.com -ClientId {your AzureApp Registration Id} -RedirectURI {Your redirect URI}

        which should give you an output like
         

        Actually Sending a Message

As well as the SMTP mail conversation test function, I also included a function that allows you to actually send an email message using SMTP, TLS and oAuth. As the System.Net.Mail.MailMessage class is now obsolete (which also takes Send-MailMessage along with it in terms of sending via oAuth), there is no way of easily sending a message without a third-party library like MailKit (which is actually a really good library and supports things like DKIM etc.). To get the message send to work I used System.Net.Mail.MailMessage and then used reflection to substitute a MemoryStream into the Send function so I could get the message from this class as a MIME stream. This stream can then be sent as part of the SMTP DATA verb (minus the X-Sender/X-Receiver headers). As mentioned in Token Acquisition, if you want to send as another user you need the normal Exchange SendAs permission granted to the delegate account you're using. To send a message with an attachment use something like

        Invoke-SendMessagewithOAuth -ServerName smtp.office365.com -SendingAddress jcool@somedomain.com -To gscales@somedomain.com -Subject "This is a Test Message" -Body "Test Body" -AttachmentFileName "c:\temp\olm.csv" -userName gscales@datarumble.com -ClientId {your AzureApp Registration Id} -RedirectURI {Your redirect URI}

        Which will produce a conversation like



        MailKit

If you're reading this post because you have existing code that needs to be converted to use oAuth, then the library you want to use in either PowerShell or C# is MailKit. For PowerShell I would also check out the https://www.powershellgallery.com/packages/Mailozaurr/0.0.9 module, which looks pretty good. The MailKit send itself only takes a few lines; a rough sketch of it follows.
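A PowerShell rendering of the MailKit send (the dll paths and the $accessToken variable are assumptions; in C# the same calls apply):

# Load the MimeKit/MailKit assemblies (e.g. extracted from the NuGet packages)
Add-Type -Path "C:\libs\MimeKit.dll"
Add-Type -Path "C:\libs\MailKit.dll"

$message = New-Object MimeKit.MimeMessage
$message.From.Add([MimeKit.MailboxAddress]::new("Glen", "gscales@datarumble.com"))
$message.To.Add([MimeKit.MailboxAddress]::new("Glen", "gscales@datarumble.com"))
$message.Subject = "This is a Test Message"
$body = New-Object MimeKit.TextPart("plain")
$body.Text = "Test Body"
$message.Body = $body

$client = New-Object MailKit.Net.Smtp.SmtpClient
$client.Connect("smtp.office365.com", 587, [MailKit.Security.SecureSocketOptions]::StartTls)
# XOAUTH2 under the covers - MailKit builds the SASL string shown earlier for you
$oauth2 = New-Object MailKit.Security.SaslMechanismOAuth2("gscales@datarumble.com", $accessToken)
$client.Authenticate($oauth2)
$client.Send($message)
$client.Disconnect($true)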

