Protection for Microsoft Teams

This article explains how to set up and configure Protection for Microsoft Teams, and describes its main features.

Introduction

Protection for Microsoft Teams extends Mimecast's world-class URL and attachment inspection capabilities to messages shared in the Microsoft Teams platform. Any content deemed malicious or suspicious is blocked, and a notification is sent to both the sender and the recipient.

Key features include:

  • Best-in-class inspection of all URLs and attachments.
  • A 14-day historical look back to identify previously delivered malicious content.
  • Unified dashboard for email and Microsoft Teams.
  • End-user notification of blocked content.
  • Optimized default policy out of the box.
  • Ability to create custom policies for specific Microsoft Teams channels.
  • Full deployment in minutes.

How harmful items are managed:

  • Harmful attachments are removed from Microsoft Teams conversations and the Microsoft Teams files space in SharePoint.
  • SharePoint links are also scanned, and if they are found to be harmful, they are removed.
  • Messages with harmful URLs are removed.
  • Administrators can access deleted attachments via the Email Security, Cloud Integrated Suite's Detections page.

Considerations

  • Phishing Detection Sensitivity and Untrustworthy settings do not apply to Microsoft Teams.
  • Microsoft Teams doesn't have the option to turn off external chat requests. See Communication with external users.
  • When identifying potential phishing emails related to Microsoft Teams, be cautious of:

    • Unexpected invites from unknown senders.
    • Emails with generic subject lines like "You've got a new Teams Message".
    • Attachments that prompt you to enter credentials.
    • Links that require additional verification steps.

    Always verify the sender's authenticity and avoid clicking on suspicious links or downloading unexpected attachments.

Prerequisites

  • Existing Email Security, Cloud Integrated account.
  • You are using Microsoft Teams.
  • You have the Microsoft 365 Global Administrator role to grant app consent.
  • Your end users have a Microsoft 365 E5 license with the Microsoft 365 Information Protection and Governance add-on. See the Microsoft Teams API page for more details.
  • It is your responsibility to ensure that you have in place all required licenses and permissions needed for you to access Protection for Microsoft Teams.

Getting Started

  1. Log in to your Email Security, Cloud Integrated account.
  2. Navigate to More Mimecast Products from the left-hand menu.
  3. Click on Protection for Microsoft Teams.

If your account is associated with a Managed Service Provider, only a Partner Administrator can start a trial.

Setting up Protection for Microsoft Teams

To configure Protection for Microsoft Teams, follow the steps below:

  1. Navigate to More Mimecast Products from the left-hand menu.
  2. Click on Protection for Microsoft Teams to view the product features.
  3. Click "Get Started" to begin.
  4. The Next Steps page provides more information to guide you through completing the setup.
  5. Click Continue to review the Terms and Conditions for the trial.

The Terms and Conditions step is skipped automatically if you are a new customer and have been provisioned with a Protection for Microsoft Teams trial.

  6. Once the Terms and Conditions are accepted, we'll verify your details and take you to the policy setup.
  7. Select your default policy - Monitor (recommended) or Protect - then select Save & Continue To Microsoft.
  8. You'll be redirected to the Microsoft application consent page. Sign in with your Microsoft 365 Global Administrator account and consent to the required permissions.
  9. Once done, you'll be redirected back to the setup, which is now complete.

14-day Historic Scan

Mimecast performs a historic scan to identify malicious URLs and attachments delivered during the previous 14 days. Once the scan is complete, you can select the action you would like to take, and you can review all detected threats from the Detections page.

View Detection Events & Manual Threat Removal

You can view all scanned messages from the Detections page. By default, you'll see Malicious or Suspicious messages. You can click on a message to see full details.

If you're using Monitor mode or choose not to remove historic threats automatically, you can remove them manually by following the steps below:

  1. Navigate to the Detections page.
  2. Select Microsoft Teams as the Service filter and Status as Delivered.

  3. Select the message, click on the Remove button, and confirm.
  4. Once the message containing the malicious or suspicious URLs or attachments has been removed, its status will be updated to Manually Removed.
  5. Blocked attachments can be downloaded directly from the details section.

Policy Management

The default policy protects your whole organization; however, if you need to make changes, then you can create a new policy using the following steps:

  1. Navigate to Policies and then Instant Messaging.
  2. Click on New Policy.
  3. Enter details for the Policy:

    • Select the target: All Users & Channels, or Teams Channel.
    • Select a Mode - Protect, Monitor, or Disable.
    • Select the Detection Action.
    • Set your alert preferences.
    • Scroll to the top and click Save.

  • If you select Teams Channel, you'll be able to look up the channels to target.
  • Policies can be used to create "exceptions", for example: Target: MSOC channel; Detection Action: Block Malware, Do nothing for Phishing, Do nothing for Suspicious (see the sketch below).
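
The exception in the last bullet is a policy scoped to a single channel that overrides the organization-wide default for that channel only. The sketch below is purely illustrative (the policy names, channel name, and resolution logic are hypothetical and not Mimecast's implementation), but it shows the intended effect:

```python
# Illustrative sketch only: models how a channel-specific policy can act as an
# "exception" to the organization-wide default. The policy resolution logic of
# the real product is not documented here; all names and actions are made up.
from dataclasses import dataclass

@dataclass
class Policy:
    name: str
    target: str      # "All Users & Channels" or a specific channel name
    mode: str        # "Protect", "Monitor", or "Disable"
    actions: dict    # detection type -> action

default_policy = Policy(
    name="Default",
    target="All Users & Channels",
    mode="Protect",
    actions={"Malware": "Block", "Phishing": "Block", "Suspicious": "Block"},
)

# Exception: in the MSOC channel, only malware is blocked.
msoc_exception = Policy(
    name="MSOC exception",
    target="MSOC",
    mode="Protect",
    actions={"Malware": "Block", "Phishing": "Do nothing", "Suspicious": "Do nothing"},
)

policies = [msoc_exception, default_policy]

def effective_policy(channel: str) -> Policy:
    """Prefer a policy targeting the specific channel over the org-wide default."""
    for policy in policies:
        if policy.target == channel:
            return policy
    return default_policy

print(effective_policy("MSOC").actions["Phishing"])     # Do nothing
print(effective_policy("Finance").actions["Phishing"])  # Block
```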

Using Allow/Block Rules

If specific URLs need to be allowed, then this can be managed from Allow & Block Rules.

Communication with External Users

Protection for Microsoft Teams can only inspect URLs and attachments within your Microsoft Teams tenant. If employees accept invitations to collaborate using the Microsoft Teams tenant of a third party, Mimecast cannot inspect content. Based on our research, competitive products operate in the same way.

Our recommendation to customers is always to have third parties collaborate within the customer's Microsoft Teams tenant.

Scenarios when an external user is communicating within the customer's Microsoft Teams tenant:

  1. Internal user invites external users to a Channel - Inspection occurs.
  2. Internal user invites external users to a 1:1 chat - Inspection occurs.
  3. Internal user invites external users to a meeting (same as chat) - Inspection occurs.

Microsoft Teams does not support file uploads in 1:1 chats or meetings between internal and external users.


Scenarios when customers are communicating using a third party's Microsoft Teams tenant:

  1. External user invites internal user to a Channel - No protection.
  2. External user invites internal user to a chat - No protection.
  3. External user invites internal user to a meeting - No protection.

Permissions Required by Microsoft Teams

When setting up Protection for Microsoft Teams, the following permissions need to be granted for the app to work:

  1. Read all groups.
  2. Read and write files in all site collections.
  3. Read all users' full profiles.
  4. Flag chat messages for violating policy.
  5. Read all chat messages.
  6. Read all channel messages.
  7. Flag channel messages for violating policy.
  8. Read the members of all teams.
  9. Read the members of all channels.
  10. Read the members of all chats.
  11. Sign in and read user profile.

Where Microsoft requires you to accept any Terms and Conditions as part of the permission granting process, it is your responsibility to fully review and understand the content of such Terms and Conditions before accepting them.  
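
After consent has been granted, you can review exactly which permissions the app received by inspecting the Mimecast enterprise application in Microsoft Entra ID (Azure AD), or by querying the Microsoft Graph API. The sketch below shows one way to list the granted application permissions; the display-name filter used to find the app and the way you obtain the access token are assumptions you will need to adapt to your tenant:

```python
# Sketch: list the application permissions (app role assignments) granted to an
# enterprise application via Microsoft Graph. Assumes you already hold an access
# token with rights to read directory/application data, and that the Mimecast
# app's display name starts with "Mimecast" (an assumption).
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # obtain via MSAL, Azure CLI, etc.
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Find the service principal (enterprise application) created during consent.
resp = requests.get(
    f"{GRAPH}/servicePrincipals",
    headers=HEADERS,
    params={"$filter": "startswith(displayName, 'Mimecast')"},
)
resp.raise_for_status()

for sp in resp.json().get("value", []):
    print(sp["displayName"], sp["id"])
    # Application permissions (e.g. "Read all chat messages") show up as
    # app role assignments; appRoleId is a GUID that maps to a permission
    # defined on the resource (usually Microsoft Graph).
    roles = requests.get(
        f"{GRAPH}/servicePrincipals/{sp['id']}/appRoleAssignments",
        headers=HEADERS,
    )
    roles.raise_for_status()
    for assignment in roles.json().get("value", []):
        print("  ", assignment["appRoleId"], "->", assignment["resourceDisplayName"])
```

Delegated permissions such as "Sign in and read user profile" are recorded as OAuth2 permission grants on the same service principal rather than as app role assignments.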

Removing Mimecast App Consent

If your trial has expired or you no longer use Protection for Microsoft Teams, you can delete the app consent from the Azure Portal.

See Microsoft for more details.
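
If you prefer to script the cleanup rather than use the portal, deleting the app's service principal (the enterprise application created during consent) removes the granted permissions. The sketch below is a hypothetical example using the Microsoft Graph API; the object ID is a placeholder that you must look up and verify first, and the token needs sufficient rights to delete applications:

```python
# Sketch: remove app consent by deleting the enterprise application's service
# principal via Microsoft Graph. Verify the object ID belongs to the Mimecast
# app before deleting anything; the ID and token below are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

service_principal_id = "<object-id-of-the-mimecast-app>"   # placeholder

resp = requests.delete(f"{GRAPH}/servicePrincipals/{service_principal_id}",
                       headers=HEADERS)
resp.raise_for_status()   # expect 204 No Content on success
print("Consent removed, status:", resp.status_code)
```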

Fair Usage Policy

Protection for Microsoft Teams is subject to fair usage limits. Each user is limited to a certain number of messages per month. Details can be found on Microsoft's page. If an account's fair usage limits are exceeded, Mimecast may throttle the service and will work with the customer to reduce the customer's usage to conform to that limit.

Mimecast also reserves the right to ask you to pay applicable excess usage fees in certain circumstances. 

Troubleshooting

In some instances, "Remove Failed" may appear under message Status for a specific item on the Detections page. This can happen for one of the following reasons:

  1. A license issue: the message could not be blocked because the user does not have the required license. See Prerequisites and ensure that you have all of the required licenses.

  2. The message was already blocked: we were unable to block the message because we had previously blocked it. This can occur when a blocked message is edited by the sender with updated content that may include a harmful URL or file.

  3. A permissions issue: we do not have permission to block the message. This happens when the sender is external and not part of the tenant we are protecting.
