Channel: TechNet Blogs

SharePoint Timer jobs not running (especially one-time timer jobs)


Problem description:

In your SharePoint farm you notice that timer jobs are not running normally: custom timer jobs or one-time timer jobs do not run when you start them. This also affects the User Profile Application (UPA) sync service, which fails to start or provision because it creates one-time timer jobs. Another good way to identify the problem is to run the "Merge-SPLogFile" command, which creates one-time timer jobs to collect logs from all the servers in the farm.

Cause:

  1. Although the SharePoint Timer service is started in services.msc, the timer service instance object (a SharePoint farm object) may be set to "Disabled". Use the script below to get the status of all the timer service instances in the farm.

    $farm = Get-SPFarm
    $FarmTimers = $farm.TimerService.Instances
    foreach ($FT in $FarmTimers)
    {
    write-host "Server: " $FT.Server.Name.ToString();
    write-host "Status: " $FT.Status;
    write-host "Allow Service Jobs: " $FT.AllowServiceJobs;
    write-host "Allow Content DB Jobs: " $FT.AllowContentDatabaseJobs;"`n"
    }
  2. The "TimerJobHistory" table in SharePoint Config DB may have too many records which causes timer jobs not to function correctly. This is followed because "Delete job history" timer job not running for a long time. This job is supposed to purge timer job history older than 7 days.Use below SQL queries and commands to validate the possibility.

    >>SQL query on your SharePoint Config DB.
    Select COUNT(*) from [SharePoint_Config].[dbo].[TimerJobHistory] with(nolock)
    Select Top 100 * from [SharePoint_Config].[dbo].[TimerJobHistory] with(nolock) order by StartTime

    The output will show a huge number of records, e.g. 5 million. You will also notice that the oldest timer job history record (the "StartTime" column) is much older than expected (normally records are at most 7 days old). Older records are supposed to be purged by the "Delete job history" timer job, which you can validate as shown below.

    >>PowerShell command on your SharePoint server.
    Get-SPTimerJob | ?{$_.DisplayName -eq "Delete job history"} | fl

    This will return the "LastRunTime" of the timer job. Normally this will be the date when the issue started.

Solution:

  1. For the 1st issue, run the PowerShell script below to re-provision the disabled instances back to the "Online" state (a sketch of restarting the timer service follows this list).

    $farm = Get-SPFarm
    $FarmTimers = $farm.TimerService.Instances
    foreach ($FT in $FarmTimers)
    {
    write-host "Server: " $FT.Server.Name.ToString();
    write-host "Status: " $FT.Status;
    write-host "Allow Service Jobs: " $FT.AllowServiceJobs;
    write-host "Allow Content DB Jobs: " $FT.AllowContentDatabaseJobs;"`n"
    }
    $disabledTimers = $farm.TimerService.Instances | where {$_.Status -ne "Online"}
    if ($disabledTimers -ne $null)
    {
    foreach ($timer in $disabledTimers)
    {
    Write-Host -ForegroundColor Red "Timer service instance on server " $timer.Server.Name " is NOT Online. Current status:" $timer.Status
    Write-Host -ForegroundColor Green "Attempting to set the status of the service instance to online..."
    $timer.Provision()
    $timer.Start()
    Write-Host -ForegroundColor Red "You MUST now go restart the SharePoint Timer service on server " $timer.Server.Name
    }
    }
    else
    {
    Write-Host -ForegroundColor Green "All Timer Service Instances in the farm are online. No problems found!"
    }
  2. For the 2nd issue, you will need Microsoft's engagement to clear out the "TimerJobHistory" table in the SharePoint Config DB. Do not attempt this yourself; directly modifying the Config DB is not a Microsoft-supported database operation. Engage Microsoft by opening a support ticket at https://support.microsoft.com/.
    Refer to this article - Support for changes to the databases that are used by Office server products and by Windows SharePoint Services
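Once an instance has been re-provisioned, restart the SharePoint Timer service on that server. A minimal sketch of doing it locally with PowerShell, assuming the default service name SPTimerV4:

    # Restart the SharePoint Timer service (service name SPTimerV4) on the local server
    Restart-Service -Name SPTimerV4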

Note: Dealing with timer service instances can cause other issues in the farm. It is better to open a ticket with Microsoft for such issues, as they may be more complex than you think.


EWS PowerShell Script to find the 'True' Read Status of an email message


Some businesses need to track the READ status of critical messages sent to all or some of their users. They want to know whether the email was delivered to the mailbox and was read by the user.

Here's another script I created for one of my customers with such a requirement. It is an EWS PowerShell script that finds the "true" read status of an email message.

Below is the report generated from the script.

To automate the complete solution you could create a wrapper script, but that is out of scope for this blog post.

  • Use Get-MessageTrackingLog and/or Get-MessageTrace (if you are in Hybrid, you will need to track both) to collect the details of the recipients to whom the message was delivered.
  • Use an account that has impersonation permissions on-premises and online (if you are in Hybrid), and use it with this script to find the read status of the given message. Something like below:
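A hypothetical invocation might look like the following. The file name and parameter names (MailboxName, Subject, Credential) are illustrative only; adjust them to whatever the script's Get-ReadStatus function actually defines:

    . .\Get-ReadStatus.ps1   # dot-source the script to load the Get-ReadStatus function (file name assumed)
    Get-ReadStatus -MailboxName user1@contoso.com -Subject "Quarterly policy update" -Credential (Get-Credential)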

Get-ReadStatus is the function exported by the script. The script can be used against on-premises or online mailboxes to find the read/unread status of an email message. The function returns a custom object with the READ status of the email. It first searches for the email message in the Inbox of the target mailbox with the provided search filters and, if it is not found there, searches the entire mailbox using search folders.

After reading a message, a user may mark it as 'Unread' to review it later. This script therefore retrieves the 'EverRead' status, an extended MAPI property, to report the read status of the message.

Features:

  • Uses Autodiscover to find the EWS URL, but for better performance you can specify the EWSUrl parameter manually.
  • Can be used against Office 365 or on-premises mailboxes. If the email message was sent to both online and on-premises mailboxes, use an account that has impersonation permissions in both organizations for easy tracking.
  • Searches the Inbox first and, if the message is not found there, searches the entire mailbox.
  • Error messages are returned on the result object, so no separate log file parsing is required to see the complete status.

Prerequisites: Requires PowerShell V3 or higher, and the EWS API 1.2 or higher installed on the machine.

 

Please leave a comment if you find it useful or for any questions.

SfB Server – Prerequisite installation failed: SqlInstanceRtcLocal


Recently while adding a new Front End Server to the existing Skype for Business Enterprise Pool we got the following message on SfB Deployment Wizard Step 1:

Prerequisite installation failed: Prerequisite installation failed: SqlInstanceRtcLocal For more information, check your SQL Server log files. Log files are in the folder C:\Program Files\Microsoft SQL Server\MSSQL*.RtcLocal\MSSQL\Log, where the * represents your SQL Server version number. For example, SQL Server 2012 uses this path: C:\Program Files\Microsoft SQL Server\MSSQL11.RtcLocal\MSSQL\Log.

After attempting to run Step 1 a second time the error message was slightly different:

Prerequisite not satisfied: SupportedSqlRtcLocal: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Shared Memory Provider, error: 40 - Could not open a connection to SQL Server)

The SQL Server (RTCLOCAL) service was installed but stopped:

We tried to start the service without success:

Windows could not start the SQL Server (RTCLOCAL) on Local Computer. For more information, review the System Event Log. If this is a non-Microsoft service, contact the service vendor, and refer to service-specific error code 5023.

And looking in Event Viewer > Windows Logs > System we could find two related errors:

Log Name: System
Source: Schannel
Date: 16/10/2017 18:35:40
Event ID: 36871
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: sfbfe04bck.recore.lab
Description:
A fatal error occurred while creating an SSL client credential. The internal error state is 10013.

Log Name: System
Source: Service Control Manager
Date: 16/10/2017 18:35:41
Event ID: 7024
Task Category: None
Level: Error
Keywords: Classic
User: N/A
Computer: sfbfe04bck.recore.lab
Description:
The SQL Server (RTCLOCAL) service terminated with the following service-specific error:
The group or resource is not in the correct state to perform the requested operation.

The error state 10013 is related to enabled protocols. We checked the enabled protocols, and on this particular server TLS 1.0 was disabled for both client and server:

Get-ChildItem -Path "HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0"
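As a quick check, you can read the DisabledByDefault and Enabled values under both the Client and Server subkeys. A minimal sketch (a DisabledByDefault of 1 or an Enabled of 0 means the protocol is turned off):

$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0'
foreach ($side in 'Client','Server') {
    # Read the SCHANNEL values for this side; missing keys are skipped silently
    "TLS 1.0 $side"
    Get-ItemProperty -Path "$base\$side" -ErrorAction SilentlyContinue | Select-Object DisabledByDefault, Enabled
}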

To re-enable TLS 1.0, we modified the following registry keys:

Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client' -Name DisabledByDefault -Value '0' -Type Dword
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client' -Name Enabled -Value '1' -Type Dword

Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server' -Name DisabledByDefault -Value '0' -Type Dword
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server' -Name Enabled -Value '1' -Type Dword

Note: After enabling or disabling protocols or cipher suites, we need to restart the server.

Because the SQL Server Express installation failed, we also had to remove the RTCLOCAL instance by going to Control Panel > Programs > Programs and Features > Uninstall a program, selecting SQL Server 2014, and then clicking Uninstall/Change:

Now we use the option to Remove:

We will be prompted to remove the RTCLOCAL:

And we only need to remove the Database Engine Services:

In Ready to Remove we select Remove and wait for the RTCLOCAL to be removed:

Please also make sure that all the database files (*.mdf and *.ldf) related to the RTCLOCAL were removed:

(Get-ChildItem "C:Program FilesMicrosoft SQL Server*RTCLOCAL" -Include *.mdf,*.ldf -Recurse).count

Since we removed the RTCLOCAL instance, we should restart the server again.

Finally, we should be able to successfully run Deployment Wizard Step 1:

Please note that it is currently not supported to disable TLS 1.0 on any role related to Lync Server 2010/2013 or Skype for Business Server 2015.

As announced at Ignite 2017, support for disabling TLS 1.0 will be added to Skype for Business Server 2015 in a future update.

 

Null Changes Do Not Get Updated to SharePoint Online’s User Profile Application


Scenario:

John was an employee of Fabrikam, Inc. until he resigned to pursue his dream of professional fly fishing.  Michael, his direct manager and the SharePoint Administrator for the Fabrikam SharePoint Online tenant, wants to stop John from showing up in his Org chart.  Michael cannot delete the user object due to company policy, so he goes to Active Directory and updates John's user account, deleting his own name from the manager field of the account.  He closes the console and goes about his day, expecting John to drop off with the next sync between his on-premises Active Directory and Office 365.

The Complication:

The next day, Michael notices that John's user account is still showing up.  He checks his AD Connect logs and sees no errors, so he connects to the Azure Active Directory PowerShell module and runs the following commands:

Connect-MSOLService
Get-MSOLUser -UserPrincipalName john@fabrikam.com

And sees that the changes have synced. Concerned, Michael opens a Service Request with Microsoft Support.

Technical Background:

SharePoint Online syncs data from the directory service into the User Profile Application incrementally. A change to a user object in AD updates a hash value stored on the object, which the User Profile Application uses to determine whether that object needs to be synced.  This is a performance feature that prevents the UPA from needlessly syncing objects that have no updates.  When the only change is to a null value, however, the object hash is not updated, so the user never syncs to the UPA, and Michael's update to John's user object never takes.

Resolution

After speaking with a Microsoft Commercial Services and Support engineer, Michael is directed to make a second update to John's user object, such as adding a birthday.  Michael opts to update the display name with a flag marking John as a terminated employee.  He then waits 24 hours and checks his Org chart the next day.  This time, John is no longer to be found.  Satisfied, Michael informs the engineer that the matter is resolved.
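In on-premises Active Directory terms, the workaround amounts to pairing the null change with a second, non-null change on the same object so that the hash is updated and the next sync picks the user up. A minimal sketch with the ActiveDirectory PowerShell module; the identity and display name are illustrative:

Import-Module ActiveDirectory
# Clear the manager attribute (the null change) and make a second, non-null change
# so the object hash used by SharePoint Online's UPA sync is updated.
Set-ADUser -Identity john -Clear manager
Set-ADUser -Identity john -DisplayName "John Doe (Terminated)"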

O365 and OneDrive with ADAL, Microsoft Graph API & Office Add-In


This post is a contribution from Manish Joshi, an engineer with the SharePoint Developer Support team

This blog demonstrates creating an Office add-in that retrieves data from SharePoint Online and OneDrive. The add-in uses ADAL.js for authentication and for retrieving the access token. In addition, it uses the Microsoft Graph API and the SharePoint REST API to access data from OneDrive and SharePoint Online.

Create new Azure Web app

  • Browse to Azure Portal https://portal.azure.com.
  • Click on App Services
  • Click  Add-> Web Apps-> Click Create
  • Set a unique App name. Example : o365officeaddin.
  • Set Resource Group: Either Create New or Select Existing. Click on Create.
  • Browse back to App Services to see the newly created web app.
  • Click on the newly created web app ->  Click Get publish profile from the menu in right section.
  • Click Save File-> The file will be saved in the "C:\Users\<<username>>\Downloads" folder with the name o365officeaddin.PublishSettings

 

Register new Azure App

  • Click on Azure Active Directory-> Click Switch directory
  • Select your corporate Directory. In my case it’s Contoso.
  • Click App registrations-> Click New application registration
  • Enter Name: o365officeaddin
  • Select Application type: Web App/ API
  • Enter Sign-on URL as the url of the newly created azure web app. Example : https://o365officeaddin.azurewebsites.net/Home.html
  • Click on Azure Active Directory-> Click App registrations-> Click o365officeaddin-> Click Properties
  • Set App ID URI: https://yourO365tenant.onmicrosoft.com/o365officeaddin. Click Save.
  • Edit the Reply URLs and add the URL of your web app with an additional query string "?et="
    Example: https://o365officeaddin.azurewebsites.net/Home.html?et=
    Note that the reply URLs are case sensitive and you may get an error if there is a case mismatch.
  • Make a note of the "Application Id"; we will use it later in the JavaScript code as the clientId.
  • Click Manifest-> Update value for “oauth2AllowImplicitFlow” to "true". Click Save

 

Grant Permissions to the new Azure app

  • Click Required permissions-> Click Windows Azure Active Directory
  • Under "Delegated Permissions" check following:
    •  Sign in and read user profile
    •  Read and write directory data
    •  Access the directory as the signed-in user
      Click Save
  • Click Add-> Select an API
  • Click Microsoft Graph-> Click Select
  • Under "Delegated Permissions" check following:
    •  Sign in and read user profile
    •  Read and write directory data
    •  Access the directory as the signed-in user
    •  Have full access to user files
    •  Have full access to all files user can access
      Click Save
  • Click Select-> Click Done
  • Click Add-> Select an API
  • Click Office 365 SharePoint Online-> Click Select
  • Under "Delegated Permissions" check following:
    •  Read and write user files
    •  Run search queries as a user
    •  Read and write managed metadata
  • Click Microsoft Graph->Click Grant Permissions-> Click Yes
  • Click Select-> Click Done

 

Create new Word Add-in Project

    • Launch Visual Studio 2015-> Click New Project
    • Under Office/SharePoint-> Select Web Add-ins-> Select Word Add-in
    • Give Name: o365officeaddin-> Click OK
    • In Home.html, insert the following code just after <body>. In the code below, update sharePointTenantName with your tenant name and clientId with the Application Id we copied in the earlier steps.
      <div id="content-header">
              <strong><span class='app-user navbar-text'></span></strong>
      
              <div class="search-area">
                  <input id="tb-search-input" type="text" value="search query" />
                  <input id="btn-search" type="button" value="Search" /><input id="btn-onedrive" type="button" value="OneDrive" />
                  <div id="search-results">
                  </div>
              </div>
          </div>
      
          <script type="text/javascript">
              (function (window, $) {
                  // Azure AD App Manifest - Set 'oauth2AllowImplicitFlow' property to 'true' ("oauth2AllowImplicitFlow": true)
                  // https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-application-manifest
                  console.log('a')
                  var sharePointTenantName = 'sponull';
                  var config = window.config = {
                      tenant: sharePointTenantName + '.onmicrosoft.com',
                      clientId: 'feea2058-6ff6-4847-9dfe-854400becd24',
                      postLogoutRedirectUri: window.location.origin,
                      endpoints: {
                          graphApiUrl: 'https://graph.microsoft.com',
                          sharePointUrl: 'https://' + sharePointTenantName + '.sharepoint.com'
                      },
                      cacheLocation: 'localStorage'
                  };
      
                  var authContext = new AuthenticationContext(config);
                  var $userDisplay = $(".app-user");
                  var $searchInput = $('#tb-search-input');
                  var $searchButton = $('#btn-search');
                  var $searchResultsDiv = $('#search-results');
                  var $onedriveButton = $('#btn-onedrive');
      
                  var isCallback = authContext.isCallback(window.location.hash);
                  authContext.handleWindowCallback();
      
                  // Check Login Status, Update UI
                  var user = authContext.getCachedUser();
                  if (user) {
                      $userDisplay.html(user.userName);
                      $userDisplay.show();
      
                      $searchButton.click(function () {
                          console.log('b');
                          var searchText = $searchInput.val();
                          $searchResultsDiv.empty();
      
                          if (searchText.length > 0) {
                              search(searchText);
                          }
                      });
                      $onedriveButton.click(function () {
                          onedrive();
                      });
                  }
                  else {
                      authContext.login();
      
                      $userDisplay.empty();
                      $userDisplay.hide();
                      $searchInput.hide();
                      $searchButton.hide();
                  }
      
                  function search(searchText) {
                      var searchEndpoint = 'https://' + sharePointTenantName + ".sharepoint.com/_api/search/query?querytext='" + searchText + "'";
      
                      authContext.acquireToken(config.endpoints.sharePointUrl, function (error, token) {
                          if (error || !token) {
                              console.log(error);
                          }
                          else {
                              $.ajax({
                                  beforeSend: function (request) {
                                      request.setRequestHeader("Accept", "application/json");
                                  },
                                  type: "GET",
                                  url: searchEndpoint,
                                  dataType: "json",
                                  headers: {
                                      'Authorization': 'Bearer ' + token,
                                  }
                              }).done(function (response) {
                                  $searchResultsDiv.html(JSON.stringify(response));
      
                              }).fail(function (response) {
                                  console.log(response.responseText);
                              });
                          }
                      });
                  }
                  function onedrive() {
                      var onedriveEndpoint = "https://graph.microsoft.com/v1.0/me";
      
                      authContext.acquireToken(config.endpoints.graphApiUrl, function (error, token) {
                          if (error || !token) {
                              console.log(error);
                          }
                          else {
                              $.ajax({
                                  beforeSend: function (request) {
                                      request.setRequestHeader("Accept", "application/json");
                                  },
                                  type: "GET",
                                  url: onedriveEndpoint,
                                  dataType: "json",
                                  headers: {
                                      'Authorization': 'Bearer ' + token,
                                  }
                              }).done(function (response) {
                                  $searchResultsDiv.html(JSON.stringify(response));
      
                              }).fail(function (response) {
                                  console.log(response.responseText);
                              });
                          }
                      });
                  }
              })(window, window.jQuery);
      
          </script>
      

 

  • Add the below script reference in the head section
    <script src="//secure.aadcdn.microsoftonline-p.com/lib/1.0.0/js/adal.min.js"></script>
    

 

Publish and install the Add-in

    • Click Build-> Build Solution
    • Right click o365officeaddin project-> Select Publish
    • Current profile: Click <New…>
    • Browse to path where you have saved the o365officeaddin.PublishSettings file. Click Finish
    • Click Deploy your web project-> Click Publish
    • Once publish has succeeded Click Package the add-in
    • Change the URL to the url of the azure web app we created in previous steps. Example https://o365officeaddin.azurewebsites.net. Click Finish
    • Next it will generate the o365officeaddinManifest.xml file. This file will be generated in the folder "bin\Debug\app.publish\OfficeAppManifests".
    • Browse to your Office 365 site
    • Browse to a document library
    • Edit an existing Word Document or create a new one -> Click Open-> Click Word Online
    • Click INSERT-> Click Office Add-ins
    • Click Upload My Add-In
    • Click Browse-> Browse to o365officeaddinManifest.xml location-> Click Upload
    • Click "Show Taskpane" to see your Add-in UI
    • Enter a search text in the search query box-> Click Search-> You will see JSON for the search results returned.

 

  • Click the OneDrive button to see it retrieve details of the current user via the Microsoft Graph REST endpoint.

 

Installing the Add-in to App Catalog

We can make the add-in available globally instead of having to upload the manifest.xml file every time. When we packaged the add-in in the earlier step, the .app file for the add-in was also generated at the path "bin\Debug\app.publish".  Follow the steps below to make this add-in available globally across your tenant without having to upload the manifest.xml file every time.

  • Browse to the app catalog site collection in your tenant
  • Go to Apps for SharePoint
  • Upload the o365officeaddin.app file from the location "bin\Debug\app.publish"
  • Install the add-in in the app catalog site collection by going to Site Actions -> Add an App
  • Once the app is installed, browse to a document library in any site collection in your tenant and add or edit a Word document in the browser.
  • Click on "Insert" -> "Office Add-ins" command in the ribbon . You should be able to see your office add-in under My Organization tab

Azure AD + 3rd party MFA = Azure AD Custom Controls


 

During Microsoft Ignite there were lots of announcements across a variety of Microsoft offerings including Azure Active Directory.

An interesting feature was released in preview called Custom Controls. Custom Controls allow integration of 3rd party security solutions and in this case, 3rd party multi-factor authentication providers.

I speak with many organizations throughout the year, and although many of them are utilizing Azure Active Directory MFA, some either require or prefer to leverage their investment in their current MFA solution.  So the question I'm often asked is, "does Azure AD support 3rd party MFA?" Well, I'm happy to say, yes it does.

By utilizing Azure Active Directory Conditional Access and Custom Controls, organizations can integrate their 3rd party MFA solution directly into the access controls to challenge access to custom apps, SaaS apps, and apps published through Azure AD Application Proxy.


Requirements

  • Azure Active Directory Premium
  • 3rd party MFA solution such as Duo, RSA, and/or Trusona


Creating a custom control

To create a custom control, navigate to portal.azure.com and select Azure Active Directory

Select Conditional Access and then “Custom controls”


 

Next select “New custom control” at the top of the page


 

We’re now asked to paste in JSON for the control. This information provides the details about the 3rd party MFA provider. For example, I have DUO configured and my JSON is below:

Please review the instructions your 3rd party MFA provider has published on how to obtain the JSON to integrate with Azure AD.

 


 

Once the custom control for the 3rd party MFA is added, go back to the Conditional Access policies and create a policy that will utilize the custom control.

Under Conditional Access select policies and “New policy”:


 

I configured a conditional access policy to use Duo with my intranet app, which is published through the Azure AD Application Proxy. Now, I could have simply checked Azure MFA; however, the purpose of this post is to demonstrate 3rd party MFA integration.


 

Let’s see it in action

[Animation: Azure AD sign-in challenged with Duo MFA]

 

To learn more about Custom Controls please see: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-conditional-access-controls#custom-controls

ConfigMgr Compliance Baseline to verify Windows Activation Status


In most environments we are running KMS and don't tend to think too often about our activation status. But with more and more systems coming and going with BYOD, and who knows what else your technicians are doing throughout your enterprise, it's good to have some peace of mind from a ConfigMgr perspective and validate that your systems are indeed licensed properly.  In this post I'll provide the PowerShell code for detection and remediation, utilizing the KMS Client Setup Keys from https://technet.microsoft.com/en-us/library/jj612867(v=ws.11).aspx, as well as the compliance baseline to import and implement for a few operating system types.  Feel free to extend it for your supported operating systems.

Discovery Script:

#Windows Activation
if((cscript "$env:SystemRoot\system32\slmgr.vbs" -dli all) -Contains "License Status: Licensed")
{
Write-Host $False
}
else
{
Write-Host $True
}

Remediation Script:

# Determine OS and Set KMS Client key

switch ((gwmi -Class Win32_OperatingSystem).Caption) {
"Microsoft Windows Server 2008 R2 Enterprise" {$License = "489J6-VHDMP-X63PK-3K798-CPX3Y"; break}
"Microsoft Windows Server 2008 R2 Standard" {$License = "YC6KT-GKW9T-YTKYR-T4X34-R7VHC"; break}
"Microsoft Windows Server 2008 R2 Datacenter" {$License = "74YFP-3QFB3-KQT8W-PMXWJ-7M648"; break}
"Microsoft Windows Server 2012 R2 Standard" {$License = "D2N9P-3P6X9-2R39C-7RTCD-MDVJX";    break}
"Microsoft Windows Server 2012 R2 Datacenter" {$License = "W3GGN-FT8W3-Y4M27-J84CP-Q3VJ9";    break}
"Microsoft Windows Server 2016 Standard" {$License = "WC2BQ-8NRM3-FDDYY-2BFGV-KHKQY";    break}
"Microsoft Windows Server 2016 Datacenter" {$License = "CB7KF-BWN84-R7R2Y-793K2-8XDDG";    break}
"Microsoft Windows 10 Enterprise" {$License = "NPPR9-FWDCX-D2C8J-H872K-2YT43";    break}
default {$License = "Unknown"}
}
#Windows Activation
If($License -ne "Unknown"){
Write-Host "Activating Windows..."
cscript "$env:SystemRootsystem32slmgr.vbs" -ipk $License //nologo
cscript "$env:SystemRootsystem32slmgr.vbs" -ato //nologo
}

Compliance Baseline:

Microsoft-Volume-License-Activation

If you have any feedback or want to provide more captions from Win32_OperatingSystem.Caption, please comment with your additions and I'll update the post.

Disclaimer: The information on this site is provided "AS IS" with no warranties, confers no rights, and is not supported by the authors or Microsoft Corporation. Use of the included script samples is subject to the terms specified in the Terms of Use.

 

ConfigMgr OSD Model Check Task Sequence Step


OK, this step is super easy and, in my opinion, a must-have when it comes to Operating System Deployment.  I can't count how many times a separate organization, or a technician for that matter, grabs a system out from under their desk or purchases something off the internet without running it past their image engineering team…  I know, shocking that this happens with all the standards we all have in place, right…

Usually when this happens it's a model we DON'T support, haven't tested, and obviously don't have drivers loaded for, nor BIOS configurations or any other model-specific applications set up and tested.  Lo and behold, we generally get a frantic call that imaging is DOWN!  Well, that is hardly ever the case; it's usually an unsupported, untested model being imaged without our knowledge.

This step just makes so much sense and is so easy that you should implement it immediately and save face before someone sends you chasing ghosts on a Friday right before the day is about to end.  Blocking your task sequence from running on unsupported models might seem harsh to some, but personally I value my time away from work and want to make the most of it, and if we can block someone from doing something they shouldn't be doing in the first place and increase our personal time, then that is a WIN in my book!

So, the nitty-gritty: simply add a Run Command Line step as one of the first steps in your task sequence and copy this PowerShell one-liner into the command line field:

Powershell.exe -Command If(@('Surface Book','Surface Pro 4','Venue 11 Pro 7139','HP Probook 650 G2','Virtual Machine','VMware Virtual Platform') -contains (Get-WmiObject -Class Win32_ComputerSystem).Model){Write-Host "Supported Model Detected:"(Get-WmiObject -Class Win32_ComputerSystem).Model; Exit 0}Else{Write-Host "Error:Unsupported Model Detected:"(Get-WmiObject -Class Win32_ComputerSystem).Model; Exit 1633}

Make sure to add or remove models to match those supported in your enterprise to ensure you have full coverage, and enjoy your weekend!
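If the one-liner becomes hard to read as your model list grows, the same logic can be kept in a small script and run from the task sequence instead. A sketch of the equivalent, expanded logic:

$supportedModels = @(
    'Surface Book', 'Surface Pro 4', 'Venue 11 Pro 7139',
    'HP Probook 650 G2', 'Virtual Machine', 'VMware Virtual Platform'
)
# Read the hardware model and compare it against the supported list
$model = (Get-WmiObject -Class Win32_ComputerSystem).Model
if ($supportedModels -contains $model) {
    Write-Host "Supported Model Detected: $model"
    Exit 0
}
else {
    Write-Host "Error: Unsupported Model Detected: $model"
    Exit 1633   # non-zero exit code fails the step and stops the task sequence
}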

Computer Models SQL Query:

select distinct model0 as 'Computer Model', COUNT(0) AS 'Number of Machines' from Computer_system_data group by model0 order by count(0) desc

Disclaimer: The information on this site is provided "AS IS" with no warranties, confers no rights, and is not supported by the authors or Microsoft Corporation. Use of the included script samples is subject to the terms specified in the Terms of Use.


[mstep] Recommended Courses for November 2017: Introduction to Data Science and Machine Learning Use Cases in IoT, and More (Updated 10/17)


mstep

mstep is a full-fledged classroom/online training program available to partners who participate in the Microsoft Partner Network.

Applications are accepted on a first-come, first-served basis and close as soon as capacity is reached, so please apply early.

mstep courses are free of charge for MPN partners, so please make full use of them.

 

mstepclassroom

******************************************************************************************************

11/1 (Wed)

MCP 70-741 Exam Preparation Seminar: Networking with Windows Server 2016

 

<Overview>

This seminar covers the Windows Server 2016 networking features included in the scope of MCP Exam 70-741, "Networking with Windows Server 2016," and reviews how to approach the key exam questions.

Note that this course does not teach Windows Server 2016 networking from scratch; the curriculum focuses specifically on exam preparation. To learn the fundamentals of Windows Server 2016 networking, we recommend the course "MSU 20741: Networking with Windows Server 2016."

 

******************************************************************************************************

11/2 (Thu)

Introduction to Data Science and Machine Learning Use Cases in IoT

 

<Overview>

In recent years IoT has reached a wide range of industries, from manufacturing and retail to energy, and is transforming how business is done. In reality, the use of sensor data and its application to business began roughly 20 years ago, so it is by no means a new trend, and there are many misconceptions about applying IoT to business. The speaker has worked on a large number of PoC (Proof of Concept) and AI-related business projects and, drawing on that experience, drives AI business projects centered on IoT. This session explains, in terms that non-specialists can understand, what data science can and cannot do, where machine learning is useful, and what its benefits are, along with key points and practical know-how.

 

******************************************************************************************************

11/15 (Wed)

MCP 70-533 Exam Preparation Seminar: Implementing Microsoft Azure Infrastructure Solutions

 

<Overview>

This seminar covers the Microsoft Azure PaaS and IaaS features included in the scope of MCP Exam 70-533, "Implementing Microsoft Azure Infrastructure Solutions," and reviews how to approach the key exam questions. Note that this course does not teach Azure services from scratch; the curriculum focuses specifically on exam preparation. It assumes prior attendance of the mstep seminars "Microsoft Azure Virtual Machine Fundamentals for IT Pros" and "Microsoft Azure PaaS Fundamentals."

 

******************************************************************************************************

11/17 (Fri)

MCP 70-346/70-347 Exam Preparation Seminar: The Two MCSA Office 365 Exams

 

<Overview>

Acquire the knowledge needed to pass MCP Exam 70-346, "Managing Office 365 Identities and Requirements," and Exam 70-347, "Enabling Office 365 Services." (Basic administration of Office 365 and the individual online services is not covered in this course.)

 

******************************************************************************************************

[See here for other public classroom courses]

mstepclassroom

Note: Enrollment for some mstep classroom courses may have already closed. Thank you for your understanding.

 


msteponline

******************************************************************************************************

Windows 10 Deployment (Session) - Creators Update Edition (June 2017)

 

<Overview>

A seminar for acquiring the knowledge needed to deploy Windows 10 within an enterprise.

******************************************************************************************************

Windows 10 Security and Management (Session) - Creators Update Edition (June 2017)

 

<Overview>

Explains the security and management features of Windows 10, with demonstrations of several of them.

******************************************************************************************************

Getting Started with Office 365 Deployment and Migration for IT Pros (June 2017)

 

<Overview>

Training on migrating to Office 365, centered on Exchange Online. It explains the multiple migration approaches available for Exchange Online, including their characteristics, requirements, workflow, and how to choose between them. Migration to SharePoint Online and getting started with Skype for Business Online are covered at an overview level.

******************************************************************************************************

SaaS Authentication with Active Directory: Office 365 Identity Management and Security Features (June 2017)

 

<Overview>

This course explains the integration of Azure Active Directory with on-premises Active Directory and single sign-on to Office 365 and third-party SaaS using Active Directory Federation Services. It also covers the security features available by combining the standard Office 365 capabilities with Azure Active Directory Premium, Intune, and Azure Information Protection, as well as device and access control.

******************************************************************************************************

MTA Security Fundamentals Exam Preparation Seminar (May 2017)

 

<Overview>

An exam preparation seminar for MTA Exam 98-367, "Security Fundamentals," using Microsoft's official courseware. This course covers the basics of security in the following areas:

• Security fundamentals grounded in an understanding of today's security landscape

• Understanding various attack methods and the defenses against them

• The security features of operating systems

• Network security, including the purpose and mechanics of firewalls and security on various (wired and wireless) networks

• Physical security measures and computer security for clients and servers

• Application security

******************************************************************************************************

 

[See here for other online courses]

msteponline

 

 

How Many Azure Subscriptions Are Enough? aka.ms/Azure/Subscriptions

$
0
0

How many subscriptions are enough?

Comparative Anatomy of Forests to Subscriptions

This question comes up too often, so it's time to share findings from my experiences and those of many of my colleagues at Microsoft.  One key point to make is that "there is no magic silver bullet"; the discussion here is similar to our early conversations about how many Active Directory forests you should have.  In BOTH cases, having simpler and fewer forests or subscriptions is easier to manage; conversely, the more you have of either, the more complexity and difficulty you will have in managing them. However, one important distinction to make with this analogy is with regards to security:

  • With Active Directory, the security boundary IS the forest
  • With Azure, the security boundary is NOT the subscription, but the Azure Active Directory Tenant.

I tell my customers: let's learn the lesson of the many organizations that went crazy with multiple domains and forests in the past, only to end up later, after much pain and administrative suffering, undoing it all and consolidating domains and forests down to a few, if not just one.  This should be the end-state goal you start with when determining the number of subscriptions.  Ask your organization: can we start with 1?  And if so, what would be the justifications for adding additional subscriptions?

Criteria for adding subscriptions

The graphic below is like the "Project Management Body of Knowledge" in that it is simply a framework, not carved in stone.  Feel free to disagree and challenge it, but please don't tell me you need dozens of subscriptions unless you can really justify them all.  Let's keep this simple.  CAVEAT EMPTOR: The following decision criteria assume a single organization and do not take mergers, acquisitions, and divestitures into account.  That could be an entire separate blog post on its own.

Subscription Decision Criteria

Subscription Limits

This is the MOST likely and most common clear-cut justification for needing another subscription.  If any of the subscription limits will be hit now or in the near future, plan for another subscription now and ward off the future trouble of having to move resources from one subscription to another.  While that is much easier to do now than in the dark days of ASM, prior planning is your friend here.

As a result, the place we don't want to be impacted is production, and many companies figure that the wild west of development and testing could eat up valuable limits needed for production resources. The starting state for many companies is one subscription for production and a second subscription for everything non-production, e.g. Dev, Test, and QA. An augmented version of this safety net, which we hear about from many customers, is a third "sandbox" environment where anything goes.

Subscription Security

With Role Based Access Control (RBAC) we can scope administrative access at any of the following levels (see the sketch after this list):

  • any subscription
  • any resource group
  • any Azure resource
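For illustration, a sketch of granting a role at each of those scopes with the AzureRM PowerShell module of the time; the account names, resource group, VM name, and subscription ID are placeholders:

# Subscription scope
New-AzureRmRoleAssignment -SignInName reader@contoso.com -RoleDefinitionName "Reader" -Scope "/subscriptions/<subscription-id>"
# Resource group scope
New-AzureRmRoleAssignment -SignInName devlead@contoso.com -RoleDefinitionName "Contributor" -ResourceGroupName "rg-app1"
# Single resource scope
New-AzureRmRoleAssignment -SignInName operator@contoso.com -RoleDefinitionName "Virtual Machine Contributor" -Scope "/subscriptions/<subscription-id>/resourceGroups/rg-app1/providers/Microsoft.Compute/virtualMachines/vm1"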

However, going back to our comparative anatomy analogy with Active Directory, subscriptions have an all-powerful equivalent to the old Enterprise Admin of Active Directory: the subscription owner, which inherits rights over every Azure resource within your subscription.  How much do you trust that person (or persons)?  That is a lot of power, holding the keys to the subscription kingdom! But there are methods to protect that top-level right, such as splitting the password into parts, or using other factors, so that it is accessible only when different people act together.

Azure Resource Manager (ARM) was designed to remove the delegated-administration barrier that existed only a few years ago, when our only scope of administration over Azure resources was the subscription itself.  We built this new ARM model; you just have to use it... AND enforce it. It is a new way to administer and control resources compared with traditional on-premises systems, but embrace it and leverage RBAC. And now that we have Azure PIM for RBAC in preview, this will make many security officers very happy.

Limit Resource Providers

Registration of resource providers enables features to become available to users in the portal. To register a resource provider, you must have permission to perform the /register/action operation for that resource provider. This permission is included in the Contributor and Owner roles.

In many cases on our projects, we provide a script that simply enables all resource providers so that it's done up front (a sketch follows).  The scenario of different subscriptions with different resource providers lit up is not a common one, but we'll call it a possibility.  It could be possible to have a subscription with limited services, like production, and then another non-production or sandbox environment with everything turned on.  Not something we see or do often at all, but it's worth putting on the table.
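For reference, a minimal sketch of that "enable everything" approach with the AzureRM PowerShell module of the time (assumes you are signed in and hold Contributor or Owner on the subscription, since /register/action is required):

# Register every available resource provider that is not yet registered in the current subscription
Get-AzureRmResourceProvider -ListAvailable |
    Where-Object { $_.RegistrationState -ne 'Registered' } |
    ForEach-Object { Register-AzureRmResourceProvider -ProviderNamespace $_.ProviderNamespace }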

Delegation of Administration

As I described above, RBAC can really break down and control access to resources to enforce least privilege.  Stuart Kwan had a great talk on identity at Ignite 2017 (Locking Down Access to the Azure Cloud...) and made a simple graphic to show how RBAC should be applied.

With all of Stuart Kwan's words of wisdom in mind, there will still be some who want further delegation, like we did with nested organizational units in AD.  But I ask you: was that really easier and simpler than a flat hierarchy?  I've seen that model go more crazy than the number of forests gone mad, so I am not a fan of creating additional subscriptions just to have another administrative container that fakes nested administration.

The best possible argument, I think, goes back to the subscription owner and whom you trust. Remember our other security tenet from above: the Azure AD tenant, NOT the subscription, is the security boundary.

Conclusion

Again, there is no magic bullet, but now you have some criteria to review and decide with. These criteria can also be used to reduce or consolidate the number of subscriptions you already have.  So what if you do have half a dozen or more subscriptions; what do you do?  I would say review the criteria above and, if possible without a lot of pain, consolidate and eliminate the subscriptions that are not necessary.  Apart from the first few criteria above, we typically see or recommend 1 subscription for production and 1 for non-production and/or a sandbox environment.  There are exceptions, but these are guidelines and not cast in stone.

Recommended Reading

Welcoming Microsoft 365 F1: Joining the Digital Transformation of Firstline Workers Around the World


Today's post was written by Bryan Goode, General Manager for Office 365.

 

On September 25 at Microsoft Ignite, we announced Microsoft 365 F1, a new offering that empowers firstline workers in this fast-changing digital era. It brings together Office 365, Windows 10, and Enterprise Mobility + Security to deliver a complete, intelligent solution that empowers every employee.

 

The modern workplace requires companies to meet employees' new expectations, connect a more distributed workforce, and give every employee the tools to create, innovate, and work together to solve customer and business problems. A truly modern workplace brings out the best and most creative in employees, builds a culture of innovation and action, and welcomes and empowers all employees, from the executive team to the firstline workforce.

 

Firstline workers make up the majority of the global workforce. Two billion people worldwide, they may work behind a counter, answer phones, or work in clinics and stores. They are often the first to face customers, representing the company's brand and presenting its products and services. They form the backbone of many of the world's largest industries, and without them many of the goals organizations pursue could not be achieved.

 

We see a new technology opportunity to give firstline workers more intuitive, immersive, and empowering experiences. Microsoft is uniquely positioned to help companies unlock the potential of their firstline workforce through our commercial products spanning Microsoft 365, Dynamics 365, Microsoft IoT, Microsoft AI, and Microsoft HoloLens, together with the Windows Mixed Reality ecosystem.

 

Empowering every employee with Microsoft 365 F1 is an important step forward in our vision of bringing firstline workers into digital transformation.

 

Video link: The Firstline Workforce and Microsoft 365

 

Digital transformation for firstline workers

 

The capabilities and tools in Microsoft 365 F1 let every employee turn their ideas into action. It fosters company culture and community through interactive Skype Meeting Broadcast and Yammer, helping employees find and share best practices across the company.

 

Microsoft 365 F1 makes it easy for companies to train and upskill employees: Microsoft Stream shares dynamic, role-specific content and video, while SharePoint makes it simple to distribute onboarding and training materials and to manage company confidential data securely.

 

It supports firstline workers in raising productivity and digitizing business processes, with Microsoft StaffHub, an application dedicated to helping firstline workers manage their shift schedules, and Microsoft PowerApps and Flow to automate everyday workflows. Today we announced new StaffHub capabilities, including the ability for employees to clock in and out and to track tasks. We are also making it easier for employees to stay connected in StaffHub through messaging integration with Microsoft Teams, the hub for teamwork, and by highlighting corporate announcements published in Yammer. Finally, we are enabling customers to connect StaffHub to workforce management systems and other tools through the availability of a general API.

 

Microsoft 365 F1 simplifies IT management, significantly lowers costs, and extends security to every employee and endpoint. Azure Active Directory provides employee identity and access management; Microsoft Intune helps secure devices; and the latest Windows 10 features simplify management for firstline workers, with Windows Assigned Access to lock down single-purpose devices and Windows AutoPilot for automated deployment.

 

Finally, we recognize how important it is to give firstline workers devices that are efficient and secure while significantly lowering total cost. We are now introducing new Windows 10 S commercial devices through our OEM partners HP, Lenovo, and Acer. With entry prices starting at $275, these devices benefit from cloud-based identity and management and are a great fit for firstline work environments.

 

We are very excited about the opportunity to empower firstline workers, and we are just getting started!

 

To learn more about our vision, visit our new Firstline Worker page and see the table below for what is included in Microsoft 365 F1.

—Bryan Goode

 

Family Games "Disneyland Adventures," "Rush: A Disney/Pixar Adventure," and "Zoo Tycoon: Ultimate Animal Collection" for Xbox One and Windows 10 PC to Launch on Tuesday, October 31, 2017


Microsoft Japan Co., Ltd. (headquarters: Minato-ku, Tokyo) will release the family adventure games "Disneyland Adventures," "Rush: A Disney/Pixar Adventure," and "Zoo Tycoon: Ultimate Animal Collection" for Xbox One and Windows 10 PC on Tuesday, October 31, 2017, at a reference price of 2,900 yen (tax excluded) for the packaged version (Xbox One only) and 2,686 yen (tax excluded) for the download version.

Disneyland Adventures

Experience the magic of Disneyland like never before! Explore Disneyland from Main Street USA to Critter Country without leaving your living room. Battle Captain Hook alongside Peter Pan, high-five Mickey Mouse, or hug Snow White. Set off on a journey where stories come to life and dreams come true! Disneyland Adventures lets kids, families, and Disney fans of all ages explore Disneyland, take on challenge quests based on the attractions, and play with beloved Disney characters.

Rush: A Disney/Pixar Adventure

Rush: A Disney/Pixar Adventure invites Disney fans of all ages on an unprecedented adventure through the worlds of six beloved Pixar Animation Studios films. With beautifully remastered graphics, solve puzzles and uncover hidden secrets alongside characters from "The Incredibles," "Ratatouille," "Up," "Cars," "Toy Story," and "Finding Dory." From briskly paced puzzle solving to fast, heart-pounding moments, adventure with the Pixar characters on screen.

Zoo Tycoon: Ultimate Animal Collection

Build the zoo of your dreams, filled with adorable animals! This full-featured zoo management simulation, packed with the excitement of building your own zoo, comes to Xbox One and Windows 10. Care for nearly 200 animals by feeding them their favorite foods and building playgrounds, and run your dream zoo freely by constructing exhibits, shops, and other new facilities across vast grounds so that visitors have a great time. A popular zoo full of visitors is full of memorable encounters with the animals! With Xbox Live online play, up to four players can build and run a zoo together. You can also join a large community of animal caretakers and connect with fellow zoo directors to make building your ideal zoo even more fun.

All three titles support Xbox Play Anywhere: buy the download version once and play on both Xbox One and Windows 10 PC at no additional cost (the disc version can be played only on Xbox One). You can resume the game on another Xbox One or Windows 10 PC right where you left off, keeping all of your save data, add-on content, and achievements.
In addition, all three support play in 4K and HDR, so on a compatible Xbox One X or Windows 10 PC you can enjoy the characters in even more beautiful detail.

Product details: Disneyland Adventures

Title: Disneyland Adventures
Platform: Xbox One / Windows 10 PC
Publisher: Microsoft Studios
Developer: Asobo Studio
Distributor in Japan: Microsoft Japan Co., Ltd.
Release date: Tuesday, October 31, 2017
Reference price: Packaged version: 2,900 yen (tax excluded) / Download version: 2,686 yen (tax excluded)
Rating: CERO A (all ages)
Genre: Family
Copyright notice: © 2017 Microsoft Corporation. © Disney; © Disney / Pixar.
Number of players: 1-2
Xbox Play Anywhere: Supported
Maximum display resolution: 4K UHD
Language: Japanese
Storage required: 15 GB or more
Windows 10 PC system requirements: See the product page or the Microsoft Store for the latest information.
Notes: The packaged version can be played only on Xbox One. Graphics vary by device. Features and requirements are subject to change. Check with retailers for actual selling prices.

Product details: Rush: A Disney/Pixar Adventure

Title: Rush: A Disney/Pixar Adventure
Platform: Xbox One / Windows 10 PC
Publisher: Microsoft Studios
Developer: Asobo Studio
Distributor in Japan: Microsoft Japan Co., Ltd.
Release date: Tuesday, October 31, 2017
Reference price: Packaged version: 2,900 yen (tax excluded) / Download version: 2,686 yen (tax excluded)
Rating: CERO A (all ages)
Genre: Family
Copyright notice: © 2017 Microsoft Corporation. © Disney / Pixar.
Number of players: 1 (Xbox Live multiplayer: 2)
Xbox Play Anywhere: Supported
Maximum display resolution: 4K UHD
Language: Japanese
Storage required: 25 GB or more
Windows 10 PC system requirements: See the product page or the Microsoft Store for the latest information.
Notes: The packaged version can be played only on Xbox One. Graphics vary by device. Features and requirements are subject to change. Check with retailers for actual selling prices.

Product details: Zoo Tycoon: Ultimate Animal Collection

Title: Zoo Tycoon: Ultimate Animal Collection
Platform: Xbox One / Windows 10 PC
Publisher: Microsoft Studios
Developer: Frontier Developments Ltd.
Distributor in Japan: Microsoft Japan Co., Ltd.
Release date: Tuesday, October 31, 2017
Reference price: Packaged version: 2,900 yen (tax excluded) / Download version: 2,686 yen (tax excluded)
Rating: CERO A (all ages)
Genre: Simulation
Copyright notice: © 2017 Microsoft Corporation. Developed by Frontier Developments Ltd. for Microsoft Corporation. Frontier, Cobra and the Frontier and Cobra logos are trademarks of Frontier Developments Ltd. All rights reserved. Cobra game development technology. © 2017 Frontier Developments Ltd. All rights reserved.
Number of players: 1 (Xbox Live multiplayer: 2-4)
Xbox Play Anywhere: Supported
Maximum display resolution: 4K UHD
Language: Japanese
Storage required: 25 GB or more
Windows 10 PC system requirements: See the product page or the Microsoft Store for the latest information.
Notes: The packaged version can be played only on Xbox One. Graphics vary by device. Features and requirements are subject to change. Check with retailers for actual selling prices.

Windows 10, 1709 available today!


Windows 10, version 1709, known as the Fall Creators Update, has been officially available since October 17, 2017!

As of that date, this version is available through all distribution channels: the Volume Licensing Service Center (VLSC) portal, Windows Update for Business, Windows Server Update Services (WSUS), and for Visual Studio (MSDN) benefit subscribers. Later in the day, an updated version of the Windows Assessment and Deployment Kit (ADK) will also be released.

A small change for IT, and for anyone handling mass deployment, is that the newly available ISO contains all editions, i.e. Education, Enterprise, and Pro. You therefore need to reference the correct index number within the WIM; a sketch of checking the indexes follows. In some distribution channels the individual editions will still be available separately. The upgrade classifications will be released to Windows Update in a similar fashion.
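To find the correct index in the combined image, you can list what the WIM contains. A minimal sketch with the DISM PowerShell module; the drive letter and path of the mounted ISO are assumptions:

# List the editions (and their index numbers) inside the combined install.wim
Get-WindowsImage -ImagePath "E:\sources\install.wim"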

So what does the fall update actually bring? Countless things. We have picked out perhaps the most important ones below, but we recommend reading through the complete list of changes and testing them in your own company.

  • On-demand sync for OneDrive saves disk space while still letting you see the full contents of your online storage without synchronizing it
  • The People bar for quick access to contacts and communication with them
  • Pinning websites to the taskbar, as in Windows 7
  • Reading e-books in the Edge browser and improved PDF support
  • Controlled folder access to protect against ransomware infections and file encryption
  • Integrated EMET to mitigate new and unknown threats
  • Linking with mobile phones and picking up where you left off on them
  • An improved Task Manager and management of system services
  • The option to limit network usage by Windows Update
  • A new panel for managing virtual machines
  • Automatic cleanup of unneeded files and of older, no-longer-needed Windows installation files
  • The notification center groups application notifications
  • Fluent Design makes the menus clearer and brings back a hint of the transparency from Windows 7
  • And dozens and hundreds of other pleasant little improvements for users and IT

- Petr Vlk (KPCS CZ, WUG)

The Fourth Industrial Revolution – making disruption work for you

$
0
0

The Fourth Industrial Revolution. It's a big topic. Big enough for it to have played a central role in the World Economic Forum's Industry Strategy Meeting in June, and for it to be a key topic in the UK Regional session at Microsoft Inspire.

 

But what is the Fourth Industrial Revolution (4IR)? Let's start by pointing out that it's not IoT or big data. It's not Factory 4.0 or the smart factory. It's all of them - and more. The first industrial revolution mechanised production. The second created mass production. The third automated it. Now, the fourth is taking that digital model even further - faster than anyone could have expected. Along the way, it's disrupting every industry. Companies that used to be leaders in their sectors are now also-rans; start-ups, such as OnDeck and LendFriend, have become successful in a matter of months instead of years.

 

Change is here... and this time it's good.

 

The World Economic Forum puts it best: "The change brought by the Fourth Industrial Revolution is inevitable, not optional. And the possible rewards are staggering". It so happens that a lot of businesses have already started to cotton on to that fact. According to IDC, 70% of the top 500 global companies will have dedicated digital transformation and innovation teams by the end of this year.

 

It all adds up to a UK market opportunity worth around $55 million - and 4IR's just getting started, so it's likely that the future will hold even more opportunity.

 

So, where can you expect the big wins to come from? Probably from three main areas. Better use of information and smarter collaboration are set to give customers better experiences. The same elements should also sharpen up companies' approaches to innovation and creativity, so it's likely that there will be a change in the way they design and market products and services. In addition, these factors - as well as creative use of IT - are improving logistics, the supply chain and time-to-market.

 

Companies expect the world...

 

Some research by PwC suggests that companies agree: they expect big things of 4IR. Around 80% expect it to improve planning and control, while over 60% believe it'll improve customer satisfaction and make production more flexible. Plus, around half say it will reduce time to market. Many industry experts believe that 4IR will have a positive effect on society across the world as well.

Let's give them the stars as well

 

That all sounds like good news for Microsoft partners. To quote Judson Althoff, our Executive Vice President, Worldwide Commercial Business: "Devices, intelligence, the cloud, the rich cloud services that we bring to market together...these combine to drive digital transformation. It's a new world of opportunity for all of us." And together, we're in a great position to go after this opportunity. As our CEO Satya Nadella says, "We'll capture this by coming together to address customer needs through four digital transformation outcomes: modern workplace; business applications; data and AI; and infrastructure and apps."

 

This means that you'll be able to deliver the exciting new apps and services that 4IR will depend on, confident that you can help people to work together anywhere, yet still be secure.

 

We'll show you how we can all take advantage of the 4IR opportunity

 

The Fourth Industrial Revolution is an exciting prospect, with its potential to transform your customers - and your own businesses. Over the next few weeks, this series of blogs will bring you insights into its likely impact and how we can help you, our partners, turn it to your advantage. Watch out for the next in the series.

 

Introduction to HoloLens and Windows Mixed Reality


Microsoft is talking more and more about "Mixed Reality," so I am taking advantage of the release of the Windows 10 Fall Creators Update to take stock of the subject. Virtual, Augmented, Mixed Reality: what is it all about?

 

Without going too far back into the history of computing, human-machine interaction has mainly relied on a keyboard and a screen. These new approaches aim to offer new experiences that are more natural and more immersive.

On the immersion side, 3D is not new and has existed for years.

Anaglyph 3D glasses

They let you perceive the depth of scenes and objects, which improves realism and immersion.

 

Virtual reality:

Fortunately there has been plenty of progress, and we now have screens small enough, and with high enough resolution, to be built into a headset or glasses.

We are therefore now seeing new devices called "virtual reality headsets."

Such a headset presents the user with a fully virtual 3D world; you can see only what is on the screen, not your real surroundings. This type of solution is often showcased with 360° videos, such as virtual tours of apartments or tourist sites.

 

Augmented reality:

For several years now it has been possible to film a scene and overlay virtual objects onto it in real time.

The democratization of smartphones brought the subject back into fashion through hit applications such as Pokémon Go.

Windows 10 lets you create your own 3D objects with Paint 3D, and you can display them through your webcam using the "Mixed Reality Viewer" application.

Windows 10 Mixed Reality Viewer

Immersion is essential for projecting yourself into this type of technology and buying into it. Having tried these approaches, what you can regret is the lack of realism in the movements, which is what lets our brain believe in and project itself into the proposed environment.

 

Windows Mixed Reality aims to offer a more immersive environment, that is, an experience in which the distinction between the physical and virtual worlds disappears. This is made possible, among other things, by integrating new sensors that analyze the surrounding environment and thus allow much more faithful tracking of movement.

Notre produit phare pour démontrer cette nouvelle approche est HoloLens, l’ordinateur holographique autonome.

HoloLens permet de superposer des éléments virtuels à des éléments physiques qui sont dans le champs de vision de l’utilisateur (oui oui comme l’hologramme de Leila dans Star Wars). On propose alors une fusion des 2 mondes.

HoloLens propose plusieurs expériences en fonction de l’usage et de l’application.

On peut en distinguer 3 principales :

  • Afficher des éléments 2D dans l’environnement de l’utilisateur (3D) comme une fenêtre Skype ou l’application photo
  • Afficher des éléments 3D qui interagissent avec l’environnement physique comme dans le jeu roboraid
  • Afficher uniquement un environnement virtuel comme des lieux touristiques dans Holotours

 

On top of the HoloLens scenarios, there is the new experience of headsets certified for "Windows Mixed Reality", whose built-in screens and sensors deliver a realistic and immersive experience.

Below is a drawing that summarizes the different experiences and levels of immersion available.

Mixed Reality headset

With the release of Windows 10 Fall Creators Update (version 1709), we have partnered with OEMs such as Dell, Lenovo, Asus, Acer and HP to offer immersive headsets that include both screens and sensors.

What changes compared with older models is that these headsets have motion sensors derived from the technologies used in Kinect and HoloLens. These sensors faithfully reproduce the movements of your body in the virtual environment (head orientation, moving around the room).

The headsets are not standalone; they need to be connected to a PC.

To bring back the desktop we are used to, with our screen and 2D applications, Windows 10 offers the "Cliff House".

The Cliff House is a virtual home that you can customize, letting you build your own world and place virtual objects and applications in it.

Below, in one of the rooms, you can see a set of windows pinned to the wall, such as the calendar. You can read the information or launch the application to move into a new environment.

Cliff House

You might think at first that this is mostly for consumers, but that is not the case: many companies are taking a very close look at these technologies to address business scenarios such as training or sales-support tools.

We are now going to look in more detail at enterprise scenarios with HoloLens.

 

Mixed Reality with HoloLens

Thanks to the technological innovations described in my colleague's article, HoloLens offers an immersive and self-contained mixed reality experience. It stands out mainly because it can add elements to the user's field of view and because it is completely standalone (no connected PC), which allows great freedom of movement.

An unlimited number of scenarios exist and can be created; here are a few examples.

 

  • Training and guidance

HoloLens enables new training scenarios that bring the experience closer to reality and make real-time guidance possible.

Schneider presents a very good concrete case where the maintenance technician does not necessarily know the tasks to perform. Each task is displayed directly on the equipment being serviced, removing any misunderstanding and therefore any handling error.

HoloLens Schneider Premset

"HoloCrane" is another example that shows very well the advantage of this mixed view for simulation, for training on new equipment, or for complex scenarios.

This guidance approach will also soon be used in an operating room. Next month a surgeon will use HoloLens to access 3D images of the patient's scans and to collaborate with experts who will not be present in the room.

 

  • Design assistance

HoloLens helps creators, developers and designers with their work. They can put their creation in context in front of them, or directly on an existing object.

Below, people working on the design of Ford cars.

Superimposing virtual images on physical objects makes it much easier to understand the final result. This approach significantly reduces development costs and time, since it is no longer necessary to build a prototype for every idea or change.

Another very good example is the "SketchUp Viewer" application, which lets you view 3D mock-ups of real-estate projects.

  • Sales assistance

HoloLens is a new tool that lets salespeople offer customers an experience that was not possible before, for example selling an apartment in a building that has not been built yet or is still under construction. This video shows how a customer can customize the apartment they want to buy and project themselves into it as if they were already there.

Manufacturers such as Volvo and Audi use HoloLens to present vehicles in small spaces such as shopping malls, but also to promote and explain the technological innovations included in their vehicles.

Your scenario

Anything is possible, and you can carry out your project in different ways:

  • Develop the applications yourself using the available resources and videos.
  • Contact us so we can support you with the development.
  • Ask one of our many partners to develop your application or support you with it.

 

Immersion

Immersion is very important in many scenarios, both in the consumer world (gaming, for example) and in the business world (B2B or B2C).

Immersion gives the user the sense of realism that matters so much during a design or sales process, and it also keeps people engaged in gaming scenarios.

I have had the chance to test the different solutions, and my preference goes to HoloLens: the immersion was so complete that when I took the headset off, for a fraction of a second my brain could not understand why the virtual object had disappeared from the room.

 

Conclusion

I hope the subject is clearer for you now. The important thing to note is that none of this is a vision of years to come; these technologies exist today.

The mixed reality headsets and HoloLens are on sale now!

HoloLens is available in France and can be purchased on the Microsoft Store site.

The mixed reality headsets from Acer, HP, Lenovo, Dell and Asus are available here, and you can check whether your computer is compatible with the "Windows Mixed Reality Check" application, or you can buy a certified PC.

 

 

Links:

For more information on "Windows Mixed Reality": https://www.microsoft.com/fr-fr/windows/windows-mixed-reality

Video: https://www.youtube.com/watch?time_continue=486&v=xTT_3DhTMI8

PC hardware prerequisites: https://support.microsoft.com/en-us/help/4039260/windows-10-mixed-reality-pc-hardware-guidelines


Deleting guest OS diagnostics data stored in Azure Table storage for a specified time range


Hello, this is Mikuni from the Azure support team.
This time, I will show you how to delete guest OS diagnostics data stored in Azure Table storage for a specified time range.

The contents of this article (including attachments and linked pages) are current as of the date of writing and are subject to change without notice.

Introduction


Azure Table storage stores information in tabular form. For example, it can store a virtual machine's "Guest OS diagnostics" data. This can be configured when the virtual machine is created, among other times.

By using "Guest OS diagnostics" you can obtain information such as VM performance data, which you can view in the portal.

 

"ゲストOSの診断"データは、たとえばストレージエクスプローラなどで下図のように蓄積されていることを確認できます。

Now, how long is this data kept?

Diagnostics log archiving for a storage account has a retention parameter that lets you specify how long logs are kept.

However, at the time of writing, the retention period of "Guest OS diagnostics" data cannot be managed.
This means you can look back at historical diagnostics data, but you are billed for the volume of data you keep. See here for billing details.

In this article I introduce a PowerShell script that deletes guest OS diagnostics data accumulated in Table storage for a specified time range.
For the classic deployment model, a PowerShell sample with the same functionality was already published here, but there was no sample for ARM, so I decided to publish one.

The script


Here is the PowerShell script. Save it as a .ps1 file and run it with PowerShell.

ClearOutAzureTableStorageEntity_ARM

The required arguments are as follows.

StorageAccountName: "storage account name"
TableName: "table name"
ResourseGroupName: "resource group name"
StartTime: "deletion start time"
EndTime: "deletion end time"

 

Specify StartTime (deletion start time) and EndTime (deletion end time) in GMT (Greenwich Mean Time) using the following format:
mm/dd/yyyy hh:mm:ss AM (or PM)
Example: 10/13/2017 06:00 AM
Here is an execution example.
PS > .\scriptForBlog.ps1 -StorageAccountName saforblog -ResourseGroupName rgForBlog -TableName WADPerformanceCountersTable -StartTime "10/13/2017 06:30 AM" -EndTime "10/13/2017 06:33 AM"
(account information omitted)
saforblog
Deleting the entity whose ETag is 'W/"datetime'2017-10-13T06%3A30%3A34.4238714Z'"'.
Deleting the entity whose ETag is 'W/"datetime'2017-10-13T06%3A30%3A34.4238714Z'"'.
Deleting the entity whose ETag is 'W/"datetime'2017-10-13T06%3A30%3A34.4238714Z'"'
(snip)
Totally deleted 183 entities in 'WADPerformanceCountersTable' Table Storage.
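
For reference, here is a minimal sketch of how such a time-range deletion could be put together with the AzureRM and Azure.Storage modules. It is not the attached script itself, only an assumption-laden outline: it filters on the system Timestamp property, expects you to be logged in with Login-AzureRmAccount, and relies on the synchronous ExecuteQuery method being available in the loaded storage library.

param(
    [Parameter(Mandatory=$true)][string]$StorageAccountName,
    [Parameter(Mandatory=$true)][string]$TableName,
    [Parameter(Mandatory=$true)][string]$ResourseGroupName,
    [Parameter(Mandatory=$true)][datetime]$StartTime,
    [Parameter(Mandatory=$true)][datetime]$EndTime
)

# Get a key for the storage account and build a table context (ARM cmdlets)
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $ResourseGroupName -Name $StorageAccountName)[0].Value
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $key
$cloudTable = (Get-AzureStorageTable -Name $TableName -Context $context).CloudTable

# Build a filter on the system Timestamp property for the requested window (UTC).
# Assumption: for the WAD* diagnostics tables this is close enough to the sample time.
$lower = [Microsoft.WindowsAzure.Storage.Table.TableQuery]::GenerateFilterConditionForDate("Timestamp", "ge", [System.DateTimeOffset]$StartTime.ToUniversalTime())
$upper = [Microsoft.WindowsAzure.Storage.Table.TableQuery]::GenerateFilterConditionForDate("Timestamp", "le", [System.DateTimeOffset]$EndTime.ToUniversalTime())
$query = New-Object Microsoft.WindowsAzure.Storage.Table.TableQuery
$query.FilterString = [Microsoft.WindowsAzure.Storage.Table.TableQuery]::CombineFilters($lower, "and", $upper)

# Delete every matching entity and report the total
$deleted = 0
foreach ($entity in $cloudTable.ExecuteQuery($query)) {
    Write-Host ("Deleting the entity whose ETag is '{0}'." -f $entity.ETag)
    $cloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Delete($entity)) | Out-Null
    $deleted++
}
Write-Host "Totally deleted $deleted entities in '$TableName' Table Storage."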

 

Looking at Storage Explorer, the result is shown in the figure below: you can see that the data from 6:30 to 6:33 has been deleted.

That's all.
I hope this is useful.
Please refer to the disclaimer before using the script.

Turning a single-NIC virtual machine into a multi-NIC virtual machine in Azure


Hello, this is Mikuni from the Azure support team.
This time, I will show you how to turn a single-NIC virtual machine into a multi-NIC one.

The contents of this article (including attachments and linked pages) are current as of the date of writing and are subject to change without notice.

Introduction


NIC stands for network interface card. When you create a virtual machine from the Azure portal, you can only create one with a single NIC, but by using Azure PowerShell and similar tools you can create a virtual machine with multiple NICs from the start.

For how to create a virtual machine with multiple NICs from the start, see the following document; a rough sketch of that approach follows the link.

Create and manage a Windows virtual machine that has multiple NICs
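
For orientation, the core of that approach looks roughly like the following sketch. The names are placeholders, and $nic1/$nic2 are assumed to be existing network interfaces created beforehand with New-AzureRmNetworkInterface; this is a sketch under those assumptions, not the full procedure from the document.

# Build a new VM configuration with two NICs from the start
$vmConfig = New-AzureRmVMConfig -VMName "myMultiNicVM" -VMSize "Standard_DS3_v2"
$vmConfig = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic1.Id -Primary
$vmConfig = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic2.Id
# ...then add the OS and image settings and create the VM with New-AzureRmVM, as described in the document above.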

This article is about the case where you want to increase the number of NICs afterwards.
In fact, adding one more NIC to a virtual machine that already has two NICs is not difficult.
This blog does not cover that case directly, so please refer to the following document.
(That said, after reading this blog you will understand how to do it.)
Now, let's get into it.

Turning a single-NIC virtual machine into a multi-NIC one


Use the following PowerShell. An explanation follows the script.

Warning: this script stops (deallocates) the virtual machine.
Warning: this script does not assign an NSG to the new NIC. If you want to assign one, customize the script or do it in the portal after adding the NIC.

######################################################################
### Log in and select the subscription
######################################################################
Login-AzureRmAccount
$mySub = Get-AzureRmSubscription | Out-GridView -Title "Select an Azure Subscription ..." -PassThru
Select-AzureRmSubscription -SubscriptionId $mySub.Id

######################################################################
### Set the basic information
### Assigning a public IP address is optional
######################################################################
$resourceGroup = "resource group name"
$location = "location"
$vmName = "virtual machine name"
$PublicIpAddressName = "public IP address name"
$dnsNameforPublicIp = "DNS name associated with the public IP address"
$vNetName = "name of the VNet the virtual machine belongs to"
$newNicName = "name of the new NIC"

$vm = Get-AzureRmVM -name $vmName -ResourceGroupName $resourceGroup

######################################################################
### Stop the virtual machine
######################################################################
stop-azurermvm -name $vmName -ResourceGroupName $resourceGroup

######################################################################
### Create a public IP address
### Comment this section out if you do not need a public IP address.
######################################################################
$pip = New-AzureRmPublicIpAddress -AllocationMethod Dynamic `
-ResourceGroupName $resourceGroup -DomainNameLabel $dnsNameforPublicIp `
-IpAddressVersion IPv4 -Location $location -Name $PublicIpAddressName

######################################################################
### Create the new NIC
### Assigning a public IP address is optional. If you do not need one,
### remove the -PublicIpAddressId option.
######################################################################
$myVnet = Get-AzureRmVirtualNetwork -Name $vNetName -ResourceGroupName $resourceGroup
$mySubnet = Get-AzureRmVirtualNetworkSubnetConfig -VirtualNetwork $myVnet `
|Out-GridView -Title "Select an Azure Subnet ..." -PassThru

$newNic = New-AzureRmNetworkInterface -Location $location `
-Name $newNicName -ResourceGroupName $resourceGroup `
-SubnetId $mySubnet.Id -PublicIpAddressId $pip.id

######################################################################
### Add the new NIC to the virtual machine
######################################################################
Add-AzureRmVMNetworkInterface -VM $vm -Id $newNic.Id

######################################################################
### Designate the primary NIC
### This procedure keeps the existing NIC as primary. If you want the
### new NIC to be primary instead, use:
### $vm.NetworkProfile.NetworkInterfaces[1].Primary=$true
######################################################################
$vm.NetworkProfile.NetworkInterfaces[0].Primary=$true

######################################################################
### Update the virtual machine
######################################################################
Update-AzureRmVm -ResourceGroupName $resourceGroup -VM $vm

######################################################################
### Start the virtual machine
######################################################################
start-azurermvm -name $vmName -ResourceGroupName $resourceGroup

 

Explanation


The important point when going from one NIC to multiple NICs is that you must designate a "primary NIC".
When the virtual machine has only one NIC, the NIC has no notion of "primary", but as soon as there are multiple NICs it becomes necessary.
You can see this for yourself by running the following command against a single-NIC VM and against a multi-NIC VM:
(Get-AzureRmVM -name [virtual machine name] -ResourceGroupName [resource group name]).NetworkProfile.NetworkInterfaces
That is why it is not enough to just add a NIC; you also have to designate the primary one.
When you add a NIC to a virtual machine that already has multiple NICs, a primary NIC is already defined, so simply adding the NIC is enough.
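
As a quick sanity check after the update, a minimal sketch like the following lists each NIC reference and its Primary flag (the VM and resource group names are placeholders):

# List the NIC references on the VM together with their Primary flag
$vm = Get-AzureRmVM -Name "virtual machine name" -ResourceGroupName "resource group name"
$vm.NetworkProfile.NetworkInterfaces | Select-Object -Property Primary, Id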
That's all.
If you have any questions, please leave a comment!
(Other blog posts that may be useful to readers of this article)

Windows Server, version 1709 available for download!


What a great day! Back in June, we announced Windows Server was joining the Semi-Annual Channel release cadence to deliver innovation at a faster pace. Two weeks ago at Ignite, we announced Windows Server, version 1709, the first release in this new model, and today you can start using it!

Software Assurance customers can download Windows Server, version 1709 from the Volume Licensing Service Center (VLSC) portal. Azure customers can also deploy Windows Server, version 1709 from the image in the Azure Marketplace. If you run virtual machines in a hosted environment, you can also check which images your service provider has made available.
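
If you deploy from the Azure Marketplace, a discovery query like the minimal sketch below is one way to see which 1709 images are visible to your subscription. It assumes the AzureRM module and an existing login session; the "MicrosoftWindowsServer" publisher name is the usual one, but the offer and SKU names returned may vary by region.

# List Windows Server SKUs in a region whose name mentions 1709
$location = "westus2"
Get-AzureRmVMImageOffer -Location $location -PublisherName "MicrosoftWindowsServer" |
    ForEach-Object { Get-AzureRmVMImageSku -Location $location -PublisherName $_.PublisherName -Offer $_.Offer } |
    Where-Object { $_.Skus -like "*1709*" } |
    Select-Object -Property Offer, Skus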

Windows Server, version 1709 is only the first step in this new world of faster release cadences. The most important aspect of having new releases twice a year is that customer feedback will shape the product. You can try the preview builds of Windows Server in the Semi-Annual Channel and provide feedback by joining the Windows Insiders program. You can also join the conversation in the Microsoft Tech Community, where plenty of professionals and experts share what they have learned and answer questions.

To stay up-to-date with all the news around Windows Server, follow our Twitter and Facebook accounts!

Event ID 50 – NTFS Delayed Write Lost


We often see alerts in SCOM for "NTFS - Delayed Write Lost" that are worth investigating further. Here is an example of how to take a closer look at one.

From below KB:

"an event ID 50 message is logged if a generic error occurs when Windows is trying to write information to the disk. This error occurs when Windows is trying to commit data from the file system Cache Manager (not hardware level cache) to the physical disk. This behavior is part of the memory management of Windows. For example, if a program sends a write request, the write request is cached by Cache Manager and the program is told the write is completed successfully. At a later point in time, Cache Manager tries to lazy write the data to the physical disk. When Cache Manager tries to commit the data to disk, an error occurs writing the data, and the data is flushed from the cache and discarded. Write-back caching improves system performance, but data loss and volume integrity loss can occur as a result of lost delayed-write failures"

{Delayed Write Failed} Windows was unable to save all the data for the file :$I30:$INDEX_ALLOCATION. The data has been lost. This error may be caused by a failure of your computer hardware or network connection. Please try to save this file elsewhere.

Or

The driver detected a controller error on \Device\Harddisk1\DR1.

If you open the event log on the source system and filter for event ID 50, you will get the event details.
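
To pull the raw events for review, a filter like this minimal sketch can be used; SERVER1 is only the example host name used in this post.

# List Event ID 50 entries from the System log on a remote server
Get-WinEvent -ComputerName SERVER1 -FilterHashtable @{ LogName = 'System'; Id = 50 } |
    Select-Object TimeCreated, ProviderName, Message |
    Format-List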

In this example, host SERVER1 shows error code 0x80040032, which is IO_LOST_DELAYED_WRITE. The status code is the key here: 0xC000026E, which equals STATUS_VOLUME_DISMOUNTED. This still does not explain how it happened, but it is a way to better understand what exactly is causing it.

More details on how to decode the error and status codes can be found here:

https://support.microsoft.com/en-us/help/816004/description-of-the-event-id-50-error-message

hope this helps,

Ramazan

Microsoft 365 Virtual Partner Summit


This past July, Microsoft announced the rebranding of Secure Productive Enterprise (SPE) as Microsoft 365 Enterprise. At the same time, we announced a new member of the family, Microsoft 365 Business. All three of these plans share some common characteristics: they help your customers standardize on a desktop platform based on Windows 10, Office 365 for productivity, and EMS for management.

My peers have been posting resources to engage with Microsoft 365 including:

Today, I wanted to make sure you were aware of the Microsoft 365 Virtual Partner Summit on October 18th (sorry for the late notice):

Microsoft 365 Virtual Partner Summit (October 18, 2017 @ 8am PST)

We are excited to invite you to attend the very first Microsoft 365 Virtual Partner Summit. Join us live for a special 2-hour event to learn from experts across marketing and engineering as they unveil exciting news that will help you take advantage of the Microsoft 365 opportunity. Get a firsthand look at what resources and tools are coming your way to help you skill up and drive success with your customers.

Special guests include:

  • Ron Markezich, CVP Office 365 Marketing
  • Srini Raghavan, GM Skype & Teams Engineering
  • Stuart Cutler, Director Windows Partner Marketing
  • Diana Pallais, Director Office Partner Marketing
  • Parri Munsell, Director Office Partner Marketing

Register now: https://aka.ms/ALS237e-Reg

Steve
