Channel: TechNet Blogs

A Mission That Connects – or: Not special. Epic!


Pack the snow jacket, forget the sunglasses, and off to the airport. That's how my trip from Berlin to southern Germany began last week – with one destination: the training camp of the German Special Olympics snowshoe runners in Inzell, in the Chiemgau region. I had really been looking forward to this visit: mountains, snow, highly motivated athletes – that sounded like a great mix, and the weekend exceeded my expectations on many levels.

But let's start at the beginning.

 

@hermsfarm and I have headed into the snow, and this weekend we are training with the @specialolympics snowshoe team. You can follow the whole spectacle over at @microsoftdeutschland! 💪🗻❄️ #gemeinsamstark #empowering #heartbeatfortheworld
📸 @rice_nelson

I brought along Markus Herrmann, with whom I have already tackled a few exciting trips, and off we went: a flight to Munich, then onward by car towards the mountains. By the time we reached the snow-covered Chiemsee at the latest, the capital-city kid in me wondered why I don't use a weekend to get out into nature much more often. And it gets better with every kilometer: the Alps slowly rise up in front of us, the sky is brilliant blue – wow.

 

In Inzell we meet Martina Steinhäußer and Björn von Borstel, two of the three national coordinators. They introduce us to the world of snowshoe running, explain how they use Microsoft technology, and give us a fascinating overview of the Special Olympics. Their team, later completed by Uwe Syksch as the third coach, consists of eight athletes with intellectual or multiple disabilities. Before we meet the guys and girls for the first time, I wonder a little how best to behave. I want to show them respect without overwhelming them – lots of thoughts that all turned out to be unnecessary. When we arrive at the team hotel, Anton Grotz immediately greets me with a high five, a wonderfully Bavarian accent and a cheeky one-liner, and the first thing Daniel Weinert wants to know is which football club I support – inclusion can be that simple. After a short round of introductions, we head straight out into the snow.

 

A mission that connects: @microsoft is the global technology partner of the @specialolympics. @marvin_ronsdorf and @hermsfarm are visiting the training camp of the German snowshoe runners this weekend and will take you along into the mountains tomorrow 🏔

 

Under floodlights we strap on snowshoes for the first time and warm up. Then it's time for sprints, and the spikes bite through the fresh powder snow. "I've never started in English before," I hear one of our athletes murmur, and I realize how much excitement such a detail can cause. But the training pays off. While some initially set off already at "racers ready", reaction times after the starting gun get better and better by the end – so the start at the World Winter Games in Austria in March won't be a problem. It gets darker, strength fades and the sun goes down.

 

Saturday, blue skies but still -8° Celsius. No problem, because we are well wrapped up. The high fives work even better than the day before, and today there is real action on the program: sprints, relay training and endurance runs. Coach Uwe has already marked out a 400 m loop and a sprint course for it.

 

Björn saves the times directly from his Surface Pro 4 to the Microsoft Cloud, keeping his athletes' performance data in view at all times – including in a worldwide comparison. Who is getting faster, and who was slacking off during home training? Nothing escapes the coach. And what does a sprint on snowshoes actually look like? Here you go:

 

The @specialolympics athletes cover distances between 25 meters and 10 kilometers in snowshoe running. Under the motto #heartbeatfortheworld, the World Winter Games – @austria2017 – take place in Schladming from 14–25 March 2017.

 

For context: I run half marathons and can sprint quite well, and yet snowshoe athlete Markus Protte, who also wins the sprint in the video, beats me. It always makes me smile to see him ease off at the end when, just like Usain Bolt, he has built up a big enough lead a few meters before the finish line. After many sprints, we all refuel at the Forsthaus Adlgaß right next to the course, and then there is a change of pace on the program: sledding.

 

 

 barely finished the germknödel and there's already a sledding break. that's how i like it! even if team marventina raced away from everyone. #specialolympics #gemeinsamstark #thermohermo

 

I team up with Valentina Beck, put in my second training session by pulling her up the entire hill on the sled, and then the two of us race back down into the valley at top speed. By the way, our sledding trip was so much fun for Valentina that during the afternoon training runs she snuck off for another run on her own. Of course you can't get away with that with Martina & co., and so a reception committee is already waiting for our runaway at the bottom of the hill.

 

 

Markus and I get one more individual training session from Anton, and slowly everyone's strength begins to fade. A few more laps for the athletes who will also run the longer distances at the World Winter Games, and then it's a wrap. Our snowshoe runners form a circle one last time and end the day with an extra helping of team spirit.

 

 

The next morning we say goodbye – for now. We reveal that we will see each other again in a few weeks, send around photos, and to finish there is one more high five with Valentina that deserves top marks. And then the weekend was already over. I got to meet many wonderful people who give everything for their sport with enormous enthusiasm and a big heart. That this works at all is thanks to the coaches, who work entirely as volunteers. I can only take my hat off to the time and passion the three of them invest, and I am glad that alongside such fantastic commitment there are partners like Microsoft supporting the work of Special Olympics.

 

 

In perfect mountain sunshine, the #gemeinsamstark training weekend of the @specialolympics snowshoe runners comes to an end. Until we meet again at @austria2017.

In any case, Markus Herrmann and I are already looking forward to seeing "our crew" again at the World Winter Games in Austria in March. Until then, we remain #gemeinsamstark!

 

A guest post by Marvin Ronsdorf
Blogger at Falsche Neun & Digisport

 


 

 


5 Facts About Office 365 from the Microsoft Cloud Deutschland


"Security and trust are the most important topics in cloud computing," said Sabine Bendiek, head of Microsoft Germany, at the launch of Office 365 Deutschland in the Microsoft Cloud Deutschland.

For this reason, Microsoft opened two data centers in Magdeburg and Frankfurt am Main back in September of last year and has since offered the infrastructure and platform services as well as the software services of Microsoft Azure under the data trusteeship of T-Systems.

Since 24 January 2017, Office 365 Deutschland has also been available in the Microsoft Cloud Deutschland – a further option for business customers, public-sector organizations and educational institutions. Alain Genevaux, head of the Office Business Group at Microsoft Deutschland, emphasizes: "This is a major milestone in supporting our customers with digitalization and enabling data-sensitive customers to use Office 365."

I have summarized what this means in concrete terms in the five most important facts about Office 365 Deutschland.

1. Office 365 Deutschland offers the familiar productivity tools

The cloud-based productivity suite Office 365 provides its users with powerful tools for communication and collaboration. Office 365 Deutschland offers the same convenience and the familiar Office 365 services, including the desktop version of the complete Office suite used by more than 1.2 billion people around the world. Anyone who already has experience with Word, Excel, Outlook, OneNote and PowerPoint and now wants to see how powerful Office 365 is for collaboration and communication within geographically and temporally distributed teams will find exactly the conveniences they are used to.

In addition to the classic applications, Office 365 Deutschland also includes Skype for Business for cross-device collaboration, OneDrive for Business for sharing data and files, SharePoint for collaboration and knowledge sharing within teams, and Exchange Online for email communication. Microsoft Project for more effective project management and Microsoft Visio for visualizing complex information are included as well.

The integration of productivity applications with apps for communication and collaboration is one of the greatest strengths of the Office suite:
"True to the motto 'everything from a single source', this eliminates error-prone interfaces between different products, vendors and cloud providers," says Christian Glanz, board member for IT and operations at Deutsche Vermögensberatung AG.


2. Office 365 Deutschland is aimed particularly at data-sensitive industries with strict data protection and compliance requirements

With the introduction of the two new data center locations in Magdeburg and Frankfurt am Main, Microsoft operates more than 100 data centers in 40 countries worldwide – more than any other cloud provider. In the past year alone, Microsoft doubled the capacity of its European cloud offerings and invested a total of more than three billion US dollars (around 2.7 billion euros) in new data centers in Germany, Austria, France and Finland, as well as in existing ones in Dublin and Amsterdam.

The investments in the two German data centers under the data trusteeship of T-Systems International GmbH (a subsidiary of Deutsche Telekom headquartered in Germany) specifically help data-sensitive companies from the EU, the European Free Trade Association (EFTA) and Germany. These include, for example, organizations from the public sector and from the medical and financial industries. The data centers support the digital transformation of such companies, because they enable the development and deployment of future-proof public cloud solutions even under particularly strict requirements for data sovereignty and compliance. The decisive factor here is that data is stored exclusively in Germany.

Christian Glanz of Deutsche Vermögensberatung again: "What convinced us was that the Microsoft Cloud Deutschland not only meets the worldwide security standards of the Microsoft Cloud, but with German data centers and a German data trustee also addresses the particular data protection and compliance needs of many German companies – and that also applies to the highly sensitive data of Deutsche Vermögensberatung."

3. The data trustee model rules out passing data to third parties without the customer's permission

We have equipped the Microsoft Cloud Deutschland with a trustee model in order to protect our customers' data as well as possible.

Access to customer data is controlled by the data trustee, T-Systems International: without the consent of the data trustee or the customer, even Microsoft has no access to this data. If the data trustee grants this consent, Microsoft accesses the customer data only under its supervision and for a limited time.


4. Office 365 from the Microsoft Cloud Deutschland comes with German support

For Office 365 Deutschland, Microsoft provides technical support around the clock, seven days a week. During normal business hours, this support is provided from Germany and in German (English optionally); outside business hours, support is provided from the EU.

5. With Office 365 Deutschland, there is really no argument left against the cloud

The availability of Office 365 Deutschland is a milestone for companies that have so far been hesitant about cloud services. Dr. Eric Schott, founder and managing director of Campana & Schott, confirms this:

"With internal compliance in mind, many of our mid-sized customers have so far deliberately decided against the advantages of cloud-based solutions. For these companies, the Microsoft Cloud Deutschland is an attractive way to use cloud services quickly, easily and in line with existing requirements. The availability of Office 365 Deutschland in particular will be a landmark step for many companies towards using the modern, digital workplace from the cloud in the future."

The German data centers use the same technologies and offer the same service levels and security standards as Microsoft's global cloud offerings. These include biometric scans, smartcards, data encryption based on SSL/TLS protocols, physical security measures, and safeguards against natural disasters and power outages.

Anyone who, beyond these globally applicable security standards, wants to ensure that their data is stored exclusively in Germany should use Office 365 Deutschland.
At Microsoft, we are convinced that the cloud is the technical foundation for the digital transformation of our country. Trust is an essential building block that we rely on. That is why we have not only taken the security concerns of German companies seriously, but have also addressed them with the Microsoft Cloud Deutschland.


More on Office 365 and Office 365 Deutschland in our virtual conferences on 14 and 21 February 2017

To introduce the Office 365 offering in the Microsoft Cloud Deutschland in more detail, we are holding two virtual conferences in February. They will cover both strategic questions of digitalization with Office 365 Deutschland and concrete application scenarios in mid-sized and large companies. I look forward to welcoming many participants!


Microsoft Office 365 in the Microsoft Cloud Deutschland
14 February 2017, 09:00–12:00
Office 365 recently became available in the Microsoft Cloud Deutschland. In this virtual conference – broadcast live from the Microsoft headquarters in Munich – we give a strategic overview of Office 365 Deutschland.

Registration: https://info.microsoft.com/de-de-landing-DE-O365-WBNR-FY17-02Feb-14-Office-365-in-der-Microsoft-Cloud-Deutschland-288070.html?wt.mc_id=AID558536_QSG_135006

The Changing World of Work for Mid-Sized and Large Companies
21 February 2017, 09:00–12:00

What does "digital transformation" mean for a company and for everyday work? We will give some concrete answers based on customer examples and also look at new offerings such as Office 365 Deutschland.

Registration: https://info.microsoft.com/de-de-landing-DE-O365-WBNR-FY17-02Feb-21-Arbeitswelt-im-Wandel-fur-mittelstandische-288428.html?wt.mc_id=AID558536_QSG_134990

 

A post by Ulrike Grewe
Product Marketing Manager for Office at Microsoft

Microsoft System Center 2016 Management Pack for Microsoft Azure


 


 

The Management Pack for Microsoft Azure enables you to monitor the availability and performance of Azure resources running on Microsoft Azure.

 

Details

Note: There are multiple files available for this download. Once you click the “Download” button, you will be prompted to select the files you need.


 

    • The Management Pack for Microsoft Azure enables you to monitor the availability and performance of Azure resources that are running on Microsoft Azure. The management pack runs on a specified server pool and then uses Microsoft Azure REST APIs to remotely discover and collect performance information about the specified Microsoft Azure resources. This management pack focuses on the collection of performance metrics made available by Azure services that use Azure Resource Manager. Azure Active Directory is used for authentication to the Azure REST APIs.

      This management pack queries the Azure REST APIs to enumerate the resources running in an Azure subscription and the performance metrics that are available for each resource. Virtual machines, web roles, and worker roles are able to store events and performance counters in Azure table storage using Azure diagnostics; if these resources are configured to use Azure diagnostics, this management pack can collect those events and performance counters.

      As soon as new Azure services are released (and older services are moved to the new Azure Resource Manager), they will be discovered automatically. As soon as the services expose performance counters via the metrics API, they will become available for collection. The MP guide includes information about what’s new in this version of the MP.
  • System Requirements

    Supported Operating System

    Windows Server 2012 Datacenter, Windows Server 2012 R2, Windows Server 2012 R2 Standard, Windows Server 2012 Standard

      This Management Pack requires System Center 2012 Service Pack 1 (SP1) – Operations Manager or System Center 2012 R2 Operations Manager or System Center 2016 Operations Manager. A dedicated Operations Manager management group is not required.
  • Install Instructions
      See the MP Guide for detailed instructions.
    As always, you should read the Management Pack guide before importing it into Operations Manager. I am only providing some highlights here.

Changes in this Management Pack

· Implemented automatic creation of the Service Principal Name.

· Updated and redesigned the Add Subscription and Add Monitoring wizards; fixed duplication of metrics for SQL Databases and fixed editing of the Exclude list.

· Implemented multi-factor authentication support.

· Reduced the delivery delay of Azure Application Insights alerts.

· Fixed issue: virtual machines did not have the “OS Version” property populated.

· Fixed issue: in SCOM, successfully completed tasks could show a “Failed” status in the Task Output.

· Improved Add Monitoring wizard performance.

· Fixed the operational state names of the “Virtual Machine Turn Off Monitor”.

· Updated the list of REST endpoints essential for Chinese instances.

· For SPN mode, removed RDFE types from the service types list.

· Removed the need for SCOM administrators to enter the password while creating or editing the monitoring configuration, and implemented a UI improvement that simplifies administering many subscriptions: multiple subscriptions can be enrolled with the same account without entering the credentials for each one.

· Fixed issue: alerts were not delivered if the discovered entity had an alert in the past (before SCOM discovered the entity).

· Fixed issue: Azure Application Insights alerts were missing links back to the Azure portal.

· Added a feature: when new Azure virtual machines are discovered or existing ones are removed, a corresponding alert is displayed in the SCOM console.

· Reduced the latency of event generation when a condition occurs in an Azure virtual machine.

· Reduced the latency for virtual machines provisioned in Azure to show up in SCOM.

· Implemented a task to query the inventory of all Azure virtual machines.

· Updated the display strings to match the recent changes.

· Added a new section to the guide: “Appendix: Display Strings Changes History”.

 

Management Pack Scope

The Management Pack for Microsoft Azure enables you to monitor the availability and performance of Azure resources that are running on Microsoft Azure. The management pack runs on a specified server pool, and then uses Microsoft Azure REST APIs to remotely discover and collect performance information about the specified Microsoft Azure resources.

This management pack focuses on the collection of performance metrics made available by Azure Services that use Azure Resource Manager.

Azure Active Directory is used for authenticating Azure REST API calls.

This management pack queries Azure REST APIs to enumerate the resources running in an Azure subscription and the performance metrics available for each resource.

Virtual machines, web roles, and worker roles can store events and performance counters into Azure table storage by means of Azure diagnostics. If these resources are configured to use Azure diagnostics, this Management Pack can collect these events and performance counters.
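The enumerate-then-collect pattern described above – authenticate with Azure Active Directory, enumerate the resources in a subscription via the Azure Resource Manager REST API, then request the metrics for each resource – can be sketched roughly as follows. This is an illustration, not the management pack's actual code; the endpoint shapes and api-version values are assumptions based on the public ARM and Azure Monitor REST conventions.

```python
# Sketch: build the ARM REST URLs that the discovery/collection pattern uses.
# No network calls are made here; pair these URLs with urllib.request or a
# similar client, sending the Azure AD bearer token in the Authorization header.

ARM = "https://management.azure.com"

def list_resources_url(subscription_id, api_version="2016-09-01"):
    """URL that enumerates all resources in a subscription."""
    return f"{ARM}/subscriptions/{subscription_id}/resources?api-version={api_version}"

def metrics_url(resource_id, api_version="2018-01-01"):
    """URL that returns metric values for a single ARM resource.
    resource_id is the full ARM ID, e.g. /subscriptions/.../virtualMachines/vm1."""
    return f"{ARM}{resource_id}/providers/microsoft.insights/metrics?api-version={api_version}"

def auth_header(token):
    """HTTP header carrying a bearer token obtained from Azure Active Directory."""
    return {"Authorization": f"Bearer {token}"}
```

In outline: request a token from Azure AD, GET `list_resources_url(...)` to discover resources, then GET `metrics_url(...)` per resource to collect the available performance metrics.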

 

Prerequisites

You must manually ensure that the prerequisites are met.

The following requirements must be met to run this Management Pack:

· You must have an Operations Manager 2012 SP1 or later environment.
This Management Pack will not import on Operations Manager 2007 R2 or Operations Manager 2012 RTM.

· Due to certain performance issues, a separate management server should be dedicated for this Management Pack.

· All management servers in the management server pool must have a connection to the Internet in order to communicate with Microsoft Azure.

· All management servers in your management server pool must have .NET Framework 4.5 or newer installed.

The Management Pack has a monitoring rule that detects the .NET Framework version on management servers. You can find management servers that are missing .NET 4.5 by checking the “Active Alerts” view in the “Microsoft Azure” folder.

· The workstation with the Operations Manager console that will be used to configure Microsoft Azure monitoring must have a connection to the Internet in order to communicate with Microsoft Azure during the initial configuration process.

· The workstation with the Operations Manager console that will be used to configure Microsoft Azure monitoring must have .NET Framework 4.5 or newer installed.

· For collecting event and performance data from Azure VMs, web roles and worker roles, Microsoft Azure Diagnostics must be enabled.
For more information about Microsoft Azure Diagnostics, see the
Collect Logging Data by Using Windows Azure Diagnostics article (http://go.microsoft.com/fwlink/?LinkId=186765).

· Microsoft Azure Diagnostics must be configured to forward diagnostic data to Microsoft Azure storage. For more information about configuring Microsoft Azure Diagnostics, see the Store and view diagnostic data in Azure Storage article (https://docs.microsoft.com/en-us/azure/cloud-services/cloud-services-dotnet-diagnostics-storage).
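Two of the prerequisites above call for .NET Framework 4.5 or newer on the servers and console workstations. For reference, the installed .NET 4.x version can also be checked programmatically: Windows records it as a DWORD named "Release" under HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full, where 378389 corresponds to .NET Framework 4.5 and higher values mean newer releases. A minimal sketch (the helper names are mine, not part of the management pack):

```python
NET45_RELEASE = 378389  # registry "Release" value corresponding to .NET Framework 4.5

def has_net45_or_newer(release_value):
    """True if a registry Release value indicates .NET Framework 4.5 or newer."""
    return release_value is not None and release_value >= NET45_RELEASE

def read_release_value():
    """Read the Release value from the registry (Windows only)."""
    import winreg  # stdlib; available only on Windows
    key_path = r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            return winreg.QueryValueEx(key, "Release")[0]
    except OSError:
        return None  # .NET 4.x Full profile not installed
```

On a management server you would call `has_net45_or_newer(read_release_value())`; the monitoring rule mentioned above performs an equivalent check for you.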

 

Mandatory Configuration

The Management Pack does not discover or monitor Microsoft Azure on import. In order to monitor Azure resources, you must perform initial configuration steps. This section explains how to configure Management Pack for Microsoft Azure to discover and monitor your Azure resources.

This section covers the following steps essential to monitor Azure resources:

1. Create an Azure Active Directory user or application to be used by the Management Pack.

2. Add Microsoft Azure Subscriptions.

 

Security Configuration

Note that the Microsoft Azure Run As Profile AD Credentials profile is configured automatically when you add a subscription; there is no need for additional configuration. The Microsoft Azure Run As Profile Proxy, however, must be configured manually when large subscriptions (1,000+ objects) are monitored. For more details, see the Managing Run As Accounts and Profiles article.

Run As Profiles:

· Microsoft Azure Run As Profile AD Credentials – holds the Azure Active Directory credential used to authenticate with Azure.

· Microsoft Azure Run As Profile Proxy – holds the credentials needed to authenticate with a proxy to reach Azure.

 

Low-Privilege Configuration

Azure Active Directory accounts can be configured to have read-only access to Azure Resource Manager resources.

To monitor older services (e.g. Mobile Services and RDFE resources), it is necessary to make the Azure Active Directory account used for monitoring a co-administrator on the subscription. You can find the corresponding instructions in the How to add or change Azure administrator roles article (see the “Azure classic portal” section).

 

Appendix: Known Issues

· Storage Account Performance view shows Virtual Machines and classic (Microsoft.ClassicStorage provider) storage accounts only.

· Some performance counters are not available until they have a non-zero value.

· The Distributed Application State view is empty by default. Create a distributed application to see the resources here.

· If SQL Azure database monitoring is enabled, the Event Log on the SCOM machine will be filled with EventID 11422 entries carrying the message “Message: ParentResourceNotFound: Cannot perform requested operation on nested resource”, because the resource provider does not return metrics for the master database. To avoid this problem, add the ‘master’ database to the exclude list.

· No performance counters are available in the Add Monitoring wizard for Mobile Services when the Operations Manager console is run on a machine running Windows 8.1 or newer. Workaround: run the wizard from the machine running the SCOM server.

· For classic (Microsoft.ClassicStorage provider) storage accounts, metrics may not be collected. To resolve the issue, go to the Azure Storage Services Versions page and select another storage service API version (the path is as follows: SCOM Console->Administration->Azure->Edit Subscription->Endpoint->Storage service API version). Performance counters are not available for non-classic (Microsoft.Storage provider) storage accounts, even if metrics collection is enabled.

· Tags are not properly discovered for SQL Azure databases (Microsoft.Sql/servers/databases), because the resource provider does not expose tags in the corresponding /read operations.

· Sometimes the Azure Management Pack cannot be removed due to a SCOM issue. To remove the MP manually, run the following code against the Operations Manager database:

exec p_TypeDeletePermanent 'E2169E37-FF79-4877-5AFF-987AAF0F9DBF'

go

exec [dbo].[p_ManagementPackRemove] 'C7B2E0B6-A068-544D-CF8F-B26A3B6DDC52'

go

· Custom Events and Performance counters for web and worker roles collected by Azure diagnostics for Virtual Machines (non-classic) are not supported.

 

· Azure Management Pack does not support proxy for Storage Grooming Probe, which is used by the following rules:

o Windows Azure NT Event Log Grooming

o Windows Azure Performance Counter Grooming

o Windows Azure .NET Trace Grooming

The reason is that Microsoft.WindowsAzure.Storage.CloudStorageAccount is used in this probe, and it does not support a proxy in practice.

 

·  The state of the insights alert rule incidents monitor may not reflect the current condition, because it changes state based on the information received from the Azure event log.
In addition, the state of the alert rule switches to unhealthy only if the latest received event corresponds to the opening of the alert. Therefore, upon initial activation of the monitor, the alert rule remains in a healthy state until the corresponding event is received. Note also that the state of the monitor is displayed as healthy if the alert rule is disabled.

· Metrics for Mobile Services and Classic Virtual Machines are not supported in China subscriptions.

· The following services are not supported in China subscriptions:

o Non-Classic Virtual Machines

o Classic Virtual Machines (none of the collection rules are supported)

o SQL Databases

o ARM Cloud Services

o BizTalk Services

o DocumentDB

o Notification Hubs

o Operational Insights

o CDN

· Adding new subscriptions may fail if monitoring templates from the previous management pack installations (if there were any) are not removed. Normally, they are removed automatically upon uninstallation of the management pack. Otherwise, remove those templates manually in order to avoid errors while adding the subscriptions.

· The “Classic Virtual Machine Turn Off Monitor” does not work if the corresponding virtual machine is stopped via the old portal. It also does not work if the virtual machine is stopped via the new portal and then started again via the old one.

· Properties of Microsoft Azure Generic Service class type objects are not populated. These objects are inherited from the System.Service class; the properties exist in this class but are not populated by the management pack.

· “[Host]” metrics of virtual machines cannot be selected in the Add Monitoring wizard; these metrics are not returned by the Azure Application Insights library. For classic virtual machines, these basic metrics are not prefixed with “[Host]” and are supported.

· “DocumentDB (NoSQL)” metrics are not available in the Add Monitoring wizard due to a problem with Microsoft.Azure.Insights dll library.

· The output of the “Obtain Service Types and Performance Counters Data” task may seem incomprehensible. It is an internal task and should not be run manually.

· “Change Number of Role Instances”, “Swap Staging And Production Slots”, “Start Deployment Slot”, “Suspend Deployment Slot” tasks may fail silently without providing you information about the failure. To resolve the issue, Azure Cloud services should be configured correctly. Create a cloud service instance and deploy at least one web/worker role. For “Swap Deployment Slots” task, you should configure two slots for Cloud service: staging and production.

· The “ArmPerformanceCollectionProbe” module may write a “Period is less than timegrain” exception to the event log. This can happen if the Interval parameter of the corresponding performance collection rule is overridden with a small value, or if the timegrain of the metric is large. To avoid this error, make sure that the Interval value is greater than the timegrain value.

· Azure Application Insights performance metrics may not be collected by SCOM. To avoid this issue, remove “request.rate” from the metrics’ list in the management pack.

· Custom WAD metrics do not appear in the performance view, and there are no alerts for custom WAD events. This issue is accompanied by “Atom format is not supported” warnings in the Event Log saying that requests to the WAD Performance Counters and WAD Events tables failed. To avoid this issue, change the “Storage service API version” to an earlier date than the default in order to make custom XMLs work (the path is as follows: SCOM Console->Administration->Azure->Edit Subscription->Endpoint->Storage service API version).

· “Redis Cache” metrics are not displayed in performance view, though they are present at the portal and selected in Add Monitoring wizard.

· The “Restart” task does not start a virtual machine (whether classic or non-classic) if its status is “Stopped” or “Stopped (deallocated)”. Moreover, it is not recommended to use this task for virtual machines with these statuses, as it may lead to monitoring issues.

· Upon upgrade of the management pack from version 1.3.22.0, monitoring may break and the following error may occur in the Operations Manager event log: “Microsoft.SystemCenter.Azure.Modules.RoleInstanceStatusProbeModule.VirtualMachineStatusDS module type cannot be found.” To resolve the issue, perform the following steps:

o Stop the Operations Manager Health Service.

o Remove the SCOM folder containing the cache (by default, the path is as follows: C:\Program Files\Microsoft System Center 2016\Operations Manager\Server\Health Service State).

o Start the Operations Manager Health Service.

Windows User Group Meetups – February 2017


The regular overview of Windows User Group events for February. Why is your application or website slow? Is the SQL database to blame, or the presentation layer? Those are among the topics awaiting us in February.

Ostrava: Monitoring and Optimizing Database Queries in Microsoft SQL Server
February 9, 2017, David Gešvindr
How to find out why an application is slow and verify whether the database is at fault. We will also cover a systematic approach to identifying the most expensive queries that load your SQL Server. Once we know what to optimize, we will focus on how SQL Server reads your data and how you can optimize that process with well-designed indexes to make problematic SQL queries more efficient.

Prague: Speeding Up Web Application Front Ends
February 14, 2017, Martin Michálek
We will look at the main reasons to care about speed and show practical tips for achieving it, from how browsers work, through HTTP/2, to optimization techniques. Make your site more attractive to visitors through its speed, too.

Brno: Windows Containers
February 21, 2017, Marek Chmel
Let's take a look together at a new Windows Server 2016 technology for DevOps work: containers. We will demonstrate basic administration of the system, which is built on the Docker engine, walk through installation and configuration of the individual parts of a deployment, and explain the difference between Hyper-V and Windows containers.

– Petr Vlk (WUG, KPCS CZ)

Tip o’ the Week 364 – F is for Function


PC keyboards have always had Function Keys, just as mainframe terminals did before them – thank IBM for cementing F-keys on the modern keyboard, though. Even Apple Macs had function keys, though the latest fad is to replace them with a Touch Bar – among other things that were replaced.

Some terminal keyboards had up to 24 function keys, with the idea that different applications would have various commands assigned to each. The modern multi-tasking, graphically-oriented operating system has largely done away with the need for function keys, but certain commands persist and are supported widely – ALT-F4, for example, will pretty much always close a Windows application. CTRL-F4 will mostly close a window or tab.

F1 usually means “help”. F2 tends to rename the thing you’ve selected. F3 normally does a “search”. F5 usually refreshes whatever you’re viewing.  There’s more.

If you’ve a Surface Book, check out Paul Thurrott’s commentary from a while back, and if you’ve any other Surface device you might find the doubling up of function keys and other regular keys causes grief at times, as having the toggled “Fn” key locked on (so as to use the F-key functions) will nullify the other functions printed on the same keys. Losing access to the key that mutes your speaker or presses play/pause might be a minor annoyance, but forfeiting the Home, End and PgUp/PgDn keys can be a right pain if you’re editing text or moving around a spreadsheet. There’s no easy way of avoiding this, other than just being aware of whether you have the Fn key toggled or not.

Somewhat obtusely, Surface Book/Pro fans may not realise that the Fn key doesn’t just toggle on and off, but can be used in conjunction with other keys to provide spot functionality – the most useful being the Fn+Del and Fn+Backspace key combinations, which change the screen brightness up and down. Certainly more regularly useful than the keyboard brightness settings that share the F1 and F2 keys. This nugget was found in the Surface Book user guide, published along with guides for other Surface devices, here.

One of the best hidden function key combos to remember, though, is the F4 key within Office applications – it repeats the last thing you did, from colouring some text to lots of other stuff. If you’re applying formatting, for example, rather than using the Format Painter command in Office apps, you could simply set the format on one paragraph/cell/whatever, then select another one to apply the same formatting just by pressing F4, and you can continue to apply the same settings by selecting some more/pressing F4, etc. Magic.

Integrating Visio Graphics Service in SharePoint 2010 Using Secure Store Service + ODC Files


This post describes how to configure Visio Graphics Service integration in SharePoint 2010 using the Secure Store Service and an ODC file.

The Secure Store Service portion applies almost unchanged to SharePoint 2016.
Regarding the use of ODC files, note that SharePoint 2016 no longer requires a separate Data Connection Library.

 

[Summary]

  1. In SharePoint's Secure Store Service, configure the account to be used for data refresh.
  2. In SharePoint Central Administration, change the Visio Graphics Service settings.
  3. Use Excel to create an .odc file and publish it to a Data Connection Library in SharePoint.
  4. In Visio, create a data connection using the .odc file in SharePoint and publish the drawing to SharePoint.

 

[Details 1]

  1. Central Administration > Application Management > Manage service applications


     

  2. Click Secure Store Service to open its management page


     

  3. Generate the initial key.


     


     

  4. Click New to register an account.


     

  5. Enter a Target Application ID and set the Type to Group. Click Next twice.


     

  6. Specify the Administrators and the member group to which access will be delegated.


     

  7. Enter the credentials.


     


     

[Details 2]

  1. Central Administration > Application Management > Manage service applications


  2. Click Visio Graphics Service.


     

  3. Click Global Settings.


  4. Set an appropriate cache age for the data, and in the external data application ID field, enter the Application ID configured in the Secure Store Service.


 

[Details 3]

  1. Create a Data Connection Library to store the data connection (.odc) file.

    Sign in to the SharePoint site with administrator rights, click Site Actions > More Options, and create a Data Connection Library as shown below.

     

  2. Start Excel and click Data > From Other Sources > From SQL Server.


     

  3. Enter the SQL Server name.


     

  4. Select the database and table to connect to.


     

  5. Click Authentication Settings.


     

  6. Enter the Application ID created in SharePoint's Secure Store Service.


     

  7. Click Browse… to export the generated .odc file to SharePoint.


     

  8. Enter the address of the Data Connection Library you created, and save.


     

  9. Click Finish, then OK, to complete the save.
  10. Verify the saved .odc file on the SharePoint site and approve it.


     


 

[Details 4]

  1. Start Visio 2010.
  2. Click Data > Link Data to Shapes.


     

  3. Select the Previously created connection option and click Next.


     

  4. Select the .odc file published to the SharePoint Data Connection Library.


     


     

  5. Click Next to complete the connection settings, then drag and drop entries from the External Data pane at the bottom of the screen onto the shapes you want to link.


     

  6. Save the Visio file to a SharePoint document library in Web Drawing format.


     

  7. The first time the saved Visio file is opened on the SharePoint site, a refresh warning like the one below appears.

    Click the “Enable (Always)” button and the warning will not appear again.

 

[Note]

The information in this blog post (attached documents, links, etc.) is current as of the date of writing and may change without notice.
It is provided for reference only, and Microsoft assumes no responsibility for it. Be sure to test thoroughly before applying it in production.

Say Hi At Ignite 2017 Australia


For those of you heading along to Microsoft Ignite on the Gold Coast next week, drop by any of the four sessions I’ll be delivering during the week and say hello. When I’m not in one of those sessions, I’ll be near the hands-on labs, dropping in to other sessions to make sure I learn something as well, or perhaps waiting patiently for you with Thomas from the OEM team at the Microsoft devices stand.


Tip of the Day: Security Updates Guide


Today’s Tip…

Microsoft has released a preview of a single destination for security vulnerability information. Check it out!

Rather than publishing bulletins to describe vulnerabilities, the new portal will let you view and search security vulnerability information in a single online database. The new portal enables you to do the following:

  • Sort and filter security vulnerability and update content, for example, by CVE, KB number, product, or release date.
  • Filter out products that don’t apply to you, and drill down to more detailed security update information for products that do.
  • Leverage a new RESTful API to obtain Microsoft security update information. This eliminates the need for you to employ outdated methods like screen-scraping of security bulletin web pages to assemble working databases of necessary and actionable information.

Security update information will be published as bulletins and on the Security Updates Guide until January 2017. After the January 2017 Update Tuesday release, Microsoft will only publish update information to the Security Updates Guide.

clip_image001

TechNet Blog: “Furthering our commitment to security updates” – https://blogs.technet.microsoft.com/msrc/2016/11/08/furthering-our-commitment-to-security-updates/

Old: “Microsoft Security Bulletins” – https://technet.microsoft.com/en-us/security/bulletins.aspx

New: “Security Updates Guide” –  https://portal.msrc.microsoft.com/en-us/security-guidance


Powershell – Get Domain Controllers Scheduled Task


A real quick post for the day. This script is designed to enumerate every Domain Controller in a forest and retrieve all of their scheduled tasks. Note: this script will not work if you run it from Windows Server 2008 R2 or Windows 7; you would need to change the script to use Get-WmiObject instead.

 

$default_log = $env:userprofile + '\Documents\dc_scheduletask_report.csv'
 
Import-Module ActiveDirectory
 
# Enumerate every DC in the forest and export each scheduled task action to CSV.
((Get-ADForest).Domains | Get-ADDomain).ReplicaDirectoryServers | ForEach-Object {
    Get-ScheduledTask -CimSession $_ | ForEach-Object {
        $taskPath = $_.TaskPath; $taskName = $_.TaskName; $taskState = $_.State
        $_.Actions | Select-Object `
            @{name='DC';expression={$_.PSComputerName}}, `
            @{name='TaskName';expression={$taskName}}, `
            @{name='TaskPath';expression={$taskPath}}, `
            @{name='State';expression={$taskState}}, `
            Execute, Arguments, WorkingDirectory |
            Export-Csv $default_log -Append -NoTypeInformation
    }
}
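If you prefer to tally the results in code instead of the Excel pivot described below, here is a minimal sketch (not part of the original script) that counts, per task name, how many distinct DCs reported that task from the exported CSV. It uses only the Python standard library, and the sample rows are hypothetical.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of the CSV produced by the PowerShell script above.
sample_csv = """DC,TaskName,TaskPath,State,Execute,Arguments,WorkingDirectory
DC01,Backup,\\,Ready,cmd.exe,/c backup.cmd,C:\\Scripts
DC02,Backup,\\,Ready,cmd.exe,/c backup.cmd,C:\\Scripts
DC01,Cleanup,\\,Ready,cmd.exe,/c clean.cmd,C:\\Scripts
"""

def dcs_per_task(csv_text):
    """Count how many distinct DCs reported each task name."""
    seen = defaultdict(set)
    for row in csv.DictReader(io.StringIO(csv_text)):
        seen[row["TaskName"]].add(row["DC"])
    return {task: len(dcs) for task, dcs in seen.items()}

counts = dcs_per_task(sample_csv)
print(counts)  # tasks that are not present on every DC deserve a closer look
```

Against a real export, pass the file contents instead of `sample_csv`; any task whose count is lower than the total number of DCs is worth investigating.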

 

Link to Script Gallery

 

Open the results in Excel.

image

Select all the columns

image

Select Insert and Pivot Table

image

Select OK

image

In the PivotTable Fields pane, drag the fields down into the areas like this:

image

There is now a nice little report.

image

Look for instances where the grand total does not equal the number of DCs in the forest.

image

Go and investigate the differences.  I hope you find this useful.  Have a good day.

-Chad

[EMS] EMS Evaluation Guides Published


Thank you, as always, for reading the Device & Mobility Team Blog. This is Suzuki, in charge of EMS.
Today I would like to introduce some recently published Enterprise Mobility + Security documentation.

The documents can be downloaded here:
https://www.microsoft.com/ja-jp/cloud-platform/products-Enterprise-Mobility-Suite.aspx

 

EMS Evaluation Guide

An evaluation guide has been published for trying out the main EMS features step by step.
EMS covers the security capabilities required in the cloud era, so take this opportunity to try it hands-on.

Enterprise Mobility + Security Startup Guide
Enterprise Mobility + Security Evaluation Guide
SPE step-by-step guide series
SPE guide 01: Deploying Azure Active Directory
SPE guide 02: Preventing unauthorized access and securing access to the cloud
SPE guide 03: Device management and email security
SPE guide 04: Strengthening authentication with Azure Active Directory
SPE guide 05: Data protection with Azure Information Protection

 

EMS Usage Scenarios

We have compiled documents covering scenarios that are easy to deploy and highly effective when adopting EMS. Because these scenarios require configuration by end users as well as administrators, each is written as a step-by-step guide. You can also edit the content and use it as a user manual.

Azure Active Directory Premium self-service password reset scenario
Azure Active Directory Premium multi-factor authentication scenario
Azure Active Directory Premium single sign-on scenario
Intune Mobile Application Management policy operation scenario

 

Azure AD SSO Support for SaaS

For partners who offer SaaS services, we have prepared documentation on integrating single sign-on (SSO) with Azure AD. The number of SaaS services in Japan that federate with Azure AD for SSO keeps growing. Why not take this opportunity to add Azure AD authentication federation to your service?

Implementing single sign-on for Microsoft partners

 

More documents are being published all the time.
Don't miss them.

Malware Finds a New Way to Evade Its Hunters


Security specialists have discovered a new piece of malware that protects itself in a particularly sophisticated way from analysis in sandbox systems or virtual machines. In a blog post, the researchers describe a threat they named the Matryoshka Doll Reconnaissance Framework, after the nested Russian matryoshka dolls, because of its layered structure. The software apparently targeted government representatives of NATO states and was distributed between Christmas 2016 and New Year 2017.

Matryoshka is delivered as an OLE object inside an RTF document. This OLE object contains an Adobe Flash object whose ActionScript first issues an HTTP request to its command-and-control server. The request carries initial information about the infected machine, such as the Flash version and the operating system. This lets the attacker judge whether the target might be a sandbox or a virtual system; if so, the attack is aborted.

And it continues…

If the attacker decides to proceed and the server returns a corresponding response to the script, the script issues a second request to another URL and downloads a compressed, encrypted Adobe Flash object, which it then unpacks. A third HTTP request follows, invoking a function that loads the payload. Finally, the encrypted malware is unpacked and executed.

After the threat was discovered, the attackers replaced the original payload with a large amount of junk data, evidently to hamper the investigation and deceive sandbox systems.

The structure of the malware points to a group of highly professional attackers. Because the software examines the infected system before loading the actual malware, the intruder is hard to detect. The quick reaction after the first discoveries of the virus also suggests experienced malware authors. So far, however, no infection among the targeted NATO representatives has become known.

Guest post by Michael Kranawetter, National Security Officer (NSO) at Microsoft Germany. On his own blog, Michael publishes everything worth knowing about vulnerabilities in Microsoft products and the software updates released for them.

Announcing Public Preview of Windows Analytics: Update Compliance


We are pleased to announce Public Preview of Update Compliance through a suite of solutions called Windows Analytics.

Upgrade Analytics is moving under the Windows Analytics suite and is being renamed Upgrade Readiness. At the same time, this blog (formerly the Upgrade Analytics blog) becomes the Windows Analytics blog.

Update Compliance is a free service that provides enterprise customers and IT professionals with a holistic view of Windows 10 update compliance for the devices in their organizations.

As an IT pro, you can use Update Compliance to keep Windows 10 devices in your organization secure and up-to-date by having:

  • Information on the installation status of both monthly quality updates and new feature updates
  • Information on the deployment progress of existing updates with a preview of which updates are scheduled to be deployed next
  • Per-device information identifying devices that need attention to resolve issues

Update Compliance uses telemetry data including installation progress, Windows Update configuration, and other information to provide these insights at no extra cost and without additional infrastructure requirements. Whether used with Windows Update for Business or other management tools, you can be assured that your devices are properly updated.

Update Compliance is built on Operations Management Suite (OMS) Log Analytics. If you are new to OMS, you can sign up and add the Windows Analytics bundle, which includes Update Compliance and Upgrade Readiness (formerly known as Upgrade Analytics). Existing OMS customers can simply add the Windows Analytics bundle. Usage of Update Compliance (Preview) is free and does not count toward any existing OMS subscription/quota or the Azure subscription/pay-as-you-go model.

There are a couple of steps you need to take in order to see your organization’s data in Update Compliance. These are described in detail in the Update Compliance documentation on TechNet. At a high level, you need to do the following:

New customers: create an OMS workspace and link the workspace to an Azure subscription. (If you do not have one, we recommend you get one with pay-as-you-go model.)

Existing OMS customers and Upgrade Analytics customers: add the Windows Analytics bundle (which includes both Update Compliance and Upgrade Readiness)

All customers:

  1. Subscribe to Windows Analytics solutions. (Go to OMS settings > Connected Sources > Windows Telemetry, and click Subscribe.)
  2. Copy the Commercial ID Key from Windows Telemetry tab in OMS and configure devices with the key you copied. (You can use Group Policy or MDM to configure.)
  3. Ensure minimum telemetry configuration on your devices is set to Basic.
  4. Ensure your network configuration allows devices to send telemetry data to Microsoft telemetry service endpoints.
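Step 3's "Basic" requirement corresponds to the Windows telemetry (AllowTelemetry) levels 0 = Security, 1 = Basic, 2 = Enhanced, and 3 = Full. As an illustrative aid (not from the original post), a compliance check only needs to verify that the configured level is at least 1:

```python
# Windows telemetry levels as exposed by the AllowTelemetry policy.
TELEMETRY_LEVELS = {0: "Security", 1: "Basic", 2: "Enhanced", 3: "Full"}

def meets_update_compliance_minimum(level: int) -> bool:
    """Update Compliance requires at least Basic (1) telemetry."""
    if level not in TELEMETRY_LEVELS:
        raise ValueError(f"unknown telemetry level: {level}")
    return level >= 1

print(meets_update_compliance_minimum(0))  # False: Security level is too low
print(meets_update_compliance_minimum(2))  # True: Enhanced includes Basic data
```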

 

Once the data is sent, it will typically show up in Update Compliance within 24 hours.

Here are a few samples of how Update Compliance (Preview) displays that data:

qualityupdatestatus

featureupdatestatus

deploymentperspective

You can follow this blog to get more information on any new updates we announce on Windows Analytics.

If you do have any questions or feedback, feel free to use the comments below.

One More Step Toward Cybersecurity


Over the past year, we have shared news about the work Microsoft's teams are doing to make users, customers, and business customers more secure online in every respect.

security-blog-image_10022017

That is why, today, you can learn about the development of, and innovative advances in, some of the platforms you already know: Windows 10, Azure SQL, and more. You will also hear about the new Office 365 improvements focused on better protecting your information, and learn a little more about the partners Microsoft has teamed up with to deliver better solutions for each of our customers.

All the details about this new step that takes Microsoft one step further toward cybersecurity are in this article.

How to give us feedback


We love hearing from you.  So what’s the best way to give us feedback?

The best way to report an issue or give a quick suggestion is the Feedback Hub on Windows 10 (Windows key + F to open it quickly). The feedback hub lets the product team see all of your feedback in one place, and allows other users to upvote and provide further comments. It’s also tightly integrated with our bug tracking and engineering processes, so that we can keep an eye on what users are saying and use this data to help prioritize fixes and feature requests, and so that you can follow up and see what we’re doing about it.

In the latest build, we have reintroduced the Hyper-V feedback category.

After typing your feedback, selecting “Show category suggestions” should help you find the Hyper-V category under Apps and Games. It looks like a couple of people have already discovered the new category:

 

Hyper-V feedback

When you put your feedback in the Hyper-V category, we are also able to collect relevant event logs to help diagnose issues. To provide more information about a problem that you can reproduce, hit “begin monitoring”, reproduce the issue, and then “stop monitoring”. This allows us to collect relevant diagnostic information to help reproduce and fix the problem.

Begin monitoring

We also love to hear from you in our forums if there are any issues you are running into. This is a good place to get direct help from the product group as well as community members. Hyper-V Forums

Hyper-V Forums

That’s all for now. Looking forward to seeing your feedback!

Cheers,
Andy

Using the Office 365 Secure Score API


The Office 365 Security Engineering team is pleased to announce the availability of the Office 365 Secure Score API. This API is fully integrated into the Microsoft Graph. If you are wondering what the Office 365 Secure Score is, get the low down here, or visit the experience here: https://securescore.office.com.

Why Collect Secure Score Data?

We think there are at least four possible business scenarios driving consumption of the Secure Score through an API:

  1. Monitor and report on your secure score in downstream reporting tools.
  2. Track your security configuration baseline.
  3. Integrate the data into compliance or cybersecurity insurance applications.
  4. Integrate Secure Score data into your SIEM or CASB to drive a hybrid or multi-cloud framework for security analytics.

Get The Data

Acquiring the secure score data from the API in a secure way requires you to set up a few prerequisites.

  • First, you should choose your consumption model. The pre-requisites fulfill both requirements, but you’ll need slightly different implementations depending on your scenario.
  • Second, you will need to register your application in Azure Active Directory (AAD) in order to call the Secure Score API. The steps to create this application are below:
    • Navigate to AAD Portal, select your desired directory, click applications, click Add, Add a new application your org is developing:

registerapp

    • Pick an App Name, pick your implementation flavor:

nameapp

    • Pick a sign-on URL, and an App ID URI. You’ll need the former in your consent grant URL, and the latter isn’t used in our demo, but will need to be a proper URI format.

appproperties

  • When you click the check-box, you’ll get the configure page for your new application. There are only a couple steps left! Take note of the ClientID and the Sign-On URL in the upper portion.

configureapp

  • Note the Client ID, set up a secret, add a new application at the bottom, select Microsoft Graph, and then add the “Read all usage reports” permission to your application.

setupperms

grantconsent

Now that you have fulfilled all the pre-requisites, you are ready to pull the data! In the future, we’ll provide sample C# code samples, and links to partners that have completed integrations, but for now, we’ve worked up two quick PowerShell scripts that demonstrate the two authentication models. All sample code and documentation about the Secure Score API can be found here: https://github.com/OfficeDev/O365-Cloud-Sec-Tooling

Get Secure Score with Interactive Logon: https://github.com/OfficeDev/O365-Cloud-Sec-Tooling/blob/master/Securescore/GetSecureScoreFromAPI-ADAL.ps1

This script will install a local ADAL library from GIT, then do a local prompt for credentials. If your global admin account requires MFA, this implementation will respect that. You will need to populate your Client ID and Redirect URL from above in the function called “Get-AuthenticationResult”. The script currently dumps out the last 9 days of Secure Score results to the console. You can edit (a local copy of) the script to do anything else you like, including converting to CSV, or integrating into a different data store. Note that as of this publishing, the API is still in Beta. Come back in the near future to acquire an updated version of the script with the non-Beta URL.

Get Secure Score with Service-to-Service OAuth Application: https://github.com/OfficeDev/O365-Cloud-Sec-Tooling/blob/master/Securescore/GetSecureScoreAPI-S2S.ps1 and https://github.com/OfficeDev/O365-Cloud-Sec-Tooling/blob/master/Securescore/ConfigForSecureScoreAPI.json

This script requires you to populate the JSON file with your Client ID, your Application Secret, your Tenant Domain, and Tenant GUID. The PowerShell script imports the config file, acquires a token, then calls the API and retrieves the last 9 days of results. Note that as of this publishing, the API is still in Beta. Come back in the near future to acquire an updated version of the script with the non-Beta URL.
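Under the hood, the service-to-service script's token acquisition is a standard Azure AD OAuth2 client-credentials request against `https://login.microsoftonline.com/<tenant>/oauth2/token` with `resource` set to the Graph endpoint. The following is a minimal Python sketch of the request it builds (illustrative only; the tenant, client ID, and secret below are placeholders, not real values):

```python
from urllib.parse import urlencode

def build_token_request(tenant_domain, client_id, client_secret):
    """Build the AAD OAuth2 client-credentials token request for Microsoft Graph."""
    url = f"https://login.microsoftonline.com/{tenant_domain}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://graph.microsoft.com",
    })
    return url, body

# Placeholder values for illustration only.
url, body = build_token_request("contoso.onmicrosoft.com", "app-id", "app-secret")
print(url)
```

The `access_token` field of the JSON response is then sent as a `Authorization: Bearer` header on the Secure Score request, which is exactly what the JSON config file supplies to the PowerShell script.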

What is in the API?

The Secure Score API has one REST method:

https://graph.microsoft.com/v1.0/reports/getTenantSecureScores(period=1)/content

Where ‘period=X’, with X representing an integer value between 1 and 90, indicates the number of days of historical data you wish to query, counting back from today’s date.
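Since `period` must be between 1 and 90, a small helper can validate it before building the request URL. This is an illustrative sketch, not part of the official API documentation:

```python
GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def secure_score_url(period: int) -> str:
    """Build the getTenantSecureScores URL, validating the period range."""
    if not 1 <= period <= 90:
        raise ValueError("period must be an integer between 1 and 90")
    return f"{GRAPH_BASE}/reports/getTenantSecureScores(period={period})/content"

print(secure_score_url(7))
# The returned URL is then called with an Authorization: Bearer <token> header.
```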

Prerequisites

One of the following scopes is required to execute this API: Reports.Read.All

HTTP request

GET https://graph.microsoft.com/v1.0/reports/getTenantSecureScores(period=1)/content

Optional query parameters

If no parameters are specified, the method returns the most recent result.

Name | Value | Description
period | Integer | Number of days of score results to retrieve, starting from the current date.


Request headers 

Header | Value
Authorization | Bearer {token}. Required.
Content-Type | application/json


Request body

Do not supply a request body for this method

Response

If successful, this method returns a 200 OK response code and, in the response body, a version object and a collection of score data objects for every Secure Score control.

Example

Request

GET https://graph.microsoft.com/v1.0/reports/getTenantSecureScores(period=1)/content

Response 

HTTP/1.1 200 OK

Content-Type: application/json; charset=utf-8

{
  "value": [
    {
      "tenantId": "12bce6d0-bfeb-4a82-abe6-98ccf3196a11",
      "createdDateTime": "2016-10-16T00:00:00+00:00",
      "licensedUsersCount": 28,
      "activeUsersCount": 0,
      "secureScore": 115,
      "organizationMaxScore": 243,
      "accountScore": 33,
      "dataScore": 45,
      "deviceScore": 37,
      "enabledService": [
        "exchange",
        "lync",
        "sharepoint",
        "OD4B",
        "Yammer"
      ],
      "controlScores": [
        {
          "AdminMFA": {
            "score": "21",
            "maxScore": "50",
            "count": "9",
            "total": "16"
          }
        }
      ],
      "averageSecureScore": 16.5588017,
      "averageMaxScore": 237.017166,
      "averageAccountScore": 3.69947028,
      "averageDataScore": 12.7047329,
      "averageDeviceScore": 0.154599056
    }
  ]
}


Enterprise Mobility + Security (EMS) Partner Community


TimTetrickPhoto

Tim Tetrick

 

Hello Microsoft Partners,

Build and extend your enterprise mobility practice, and help customers stay secure and productive on the apps and devices they choose to use. The Enterprise Mobility + Security (EMS) Partner Community is led by technical subject-matter experts, and activities include blog posts, community calls, and Yammer group discussions.

February 2017 community call and topic
New Microsoft Intune functionality on Microsoft Azure
Sign up for the February 23 partner call

January 2017 community call and topic
Integration of Lookout mobile security with Microsoft Intune
Watch the January call on demand
Read the introductory blog post

December 2016 community call and resources
EMS demo tools, resource sites, and funding programs
Watch the December call on demand
Read the introductory blog post
Introducing the EMS Partner Learning Plan

Recaps of community calls and resources
December communities recap
November communities recap
October communities recap
September communities recap

Related resources
EMS Partner presales technical and deployment services
Secure Productive Enterprise

Join the Enterprise Mobility + Security Partner Community
Sign up for community calls
Join the Yammer group
Blog series
Training and enablement
Video playlist

Build a Cloud Managed Services Practice on the Cloud Solution Provider Program [Updated 2/11]


(This article is a translation of Build a cloud managed services practice, with Cloud Solution Provider as your foundation, posted on the Microsoft Partner Network blog on December 14, 2016. See the linked page for the latest information.)

 

williamlewallen-authorblock

 

The number of partners selling their own cloud services through the Cloud Solution Provider (CSP) program has grown 40% in recent months. As Microsoft's cloud business continues to evolve and grow, we want to actively provide information so that partners can deepen their understanding of CSP.

 

Microsoft has provided clear guidance on the CSP program so that partners focused on cloud services can succeed and improve their profitability. We added a new Cloud Solution Provider page to the partner portal and published a variety of resources explaining the business opportunity and how to join the program and grow your profitability.

Cloud Solution Provider program page

Download the Cloud Solution Provider program FAQ (English)

 

Strengthen Profitability with Cloud Managed Services

One way to increase the profitability of your partnership is to offer cloud managed services. Interest in this area is enormous, but for many partners, delivering cloud managed services can seem opaque and difficult. To dispel that image, we recently published the Azure Managed Services Playbook, practical and easy-to-follow guidance exclusively for CSP partners. Beyond explaining the benefits of cloud managed services, the playbook focuses on what services to offer and how to deliver them.

 

When we introduced this playbook at a recent conference for managed service providers, it was extremely well received. Partners who have already put it to use tell us it has delivered concrete benefits to their cloud service business.

 

Download the Azure Managed Services Playbook for CSP partners (English)

 

Take this opportunity to deepen your understanding of the business opportunities the Cloud Solution Provider program opens up, and build your own cloud services. This is a major opportunity for Microsoft and partners alike. We sincerely thank all the partners walking alongside us toward success with cloud managed services.

 

Cloud services related links

Four steps to measure your business's cloud maturity

Profitability scenarios and financial models

Cloud-ready business toolkit overview and readiness assessment (English)

Introduction to the customer lifetime value modeling tool (English)

Building intellectual property (IP) for growth and profitability (English)

 

MPN 101: Overview of the Cloud Solution Provider program and the business opportunity

Watch online (English)

 

 

 

MAP Toolkit 9.6 Now Available!


We are pleased to announce the availability of version 9.6 of the Microsoft Assessment and Planning (MAP) Toolkit. This release of the MAP Toolkit helps increase the agility and cost effectiveness of deploying the latest Microsoft technologies. MAP 9.6 has been updated to inventory, assess, and report on Windows Server 2016 and to perform Windows Server 2016 readiness assessments. It has also been updated for accessibility, with features such as high-contrast mode support, tabular data for charts, logical tab movement, and consistent icons; please refer to the detailed update notes available here. It also helps organizations assess their environment for Windows 10, SQL Server 2016, Office 2016, and Office 365, track usage of Windows Servers, and prepare for migration to Windows Azure virtual machines.

The latest version of the MAP Toolkit (9.6) is available here.

You asked, We listened

As part of the MAP team’s ongoing improvement initiatives for this release, we reviewed a number of feedback submissions and implemented changes based on the MAP community’s suggestions.  We’d love to hear more. Please send comments and suggestions to mapfdbk@microsoft.com.

 

Creating an Ubuntu Server 16.10 specialized image on Azure and Deploying a new VM based on this image using Azure CLI 2.0


Hello Everyone,

This step by step guide will show you how to prepare a specialized image from an Ubuntu 16.10 image from the marketplace, using the newest managed disks that were released on 02/08/2017.

 

Specialized virtual machines are nothing more than virtual machines installed from the Azure Marketplace and then manually customized and prepared to become an image. If the VM is Windows-based, that means executing Sysprep as a final step; if it is Linux-based, it means de-provisioning the Linux agent (the Azure agent) as a final step.

 

In order to follow this guide, please deploy a new Ubuntu 16.10 virtual machine using managed disks instead of unmanaged disks (the old way, with storage accounts), through the portal or any other deployment method. For this example, I customized the image by installing XRDP on it, following the steps outlined in this document.

 

It is also assumed that you have Azure CLI 2.0 installed on a management computer (follow the steps outlined here to perform the installation). All command-line instructions below assume you are executing them in a Linux bash shell.

 

Creating the specialized image

 

  1. After you have your master Ubuntu virtual machine configured the way you need, please execute the following commands in order to make it available to be used as an image, this needs to be accomplished inside your virtual machine.
    sudo waagent -force -deprovision
    export HISTSIZE=0

  2. From a management computer (where Azure CLI 2.0 is installed) shutdown your specialized virtual machine
    az vm deallocate --resource-group linux-test-rg --name ubuntu-vm-01

  3. Generalize the virtual machine from Azure perspective
    az vm generalize --resource-group linux-test-rg --name Ubuntu-VM-01

  4. Still from the management computer, create an image from the specialized virtual machine. The image must be in the same resource group as your master virtual machine.
    az image create --resource-group linux-test-rg --name UbuntuMasterImage01 --os-type Linux --source Ubuntu-VM-01 --location eastus
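The three management-computer steps above can be wrapped in a single bash function so the sequence is repeatable. This is a minimal sketch of my own, not part of the original post: the function name make_image_from_vm is mine, and the example invocation reuses the resource names from this guide.

```shell
#!/usr/bin/env bash
# Sketch: steps 2-4 above as one function. The resource group, VM and image
# names in the example invocation are the values used in this post.
set -euo pipefail

make_image_from_vm() {
  local rg="$1" vm="$2" image="$3" location="$4"
  # Stop and deallocate the master VM so its disk is in a consistent state.
  az vm deallocate --resource-group "$rg" --name "$vm"
  # Mark the VM as generalized from Azure's perspective.
  az vm generalize --resource-group "$rg" --name "$vm"
  # Create the managed image in the same resource group as the master VM.
  az image create --resource-group "$rg" --name "$image" \
      --os-type Linux --source "$vm" --location "$location"
}

# Example invocation (requires an authenticated Azure CLI session):
# make_image_from_vm linux-test-rg Ubuntu-VM-01 UbuntuMasterImage01 eastus
```

Running the function requires an authenticated `az login` session; the commands themselves are exactly the ones from steps 2–4.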

Deploying a new VM from the specialized image using managed disks

  1. Obtain the image id
    az image list --query '[].{name: name, id: id}' 

    Result

    image

  2. Create the VM based on the specialized image
    az vm create \
        --image /subscriptions/<GUID>/resourceGroups/LINUX-TEST-RG/providers/Microsoft.Compute/images/UbuntuMasterImage01 \
        --admin-username azureuser \
        --admin-password Test@2017-123! \
        --public-ip-address-dns-name tstubuntudemo01 \
        --resource-group linux-test-rg \
        --location eastus \
        --name Ubuntu-VM-03 \
        --vnet-name vnet \
        --subnet subnet01 \
        --size standard_a1_v2 \
        --os-type linux \
        --authentication-type password \
        --os-disk-name Ubuntu-VM-03-osdisk
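Rather than pasting the full image resource id, you can look it up by name with a JMESPath --query and feed the result straight into az vm create. A minimal sketch of my own (the helper name create_vm_from_image is mine; add the remaining flags from the command above as your deployment requires):

```shell
#!/usr/bin/env bash
# Sketch: look up the image id by name instead of copying the full resource
# id. The names in the example invocation are the values from this post.
set -euo pipefail

create_vm_from_image() {
  local rg="$1" image_name="$2" vm="$3"
  # --query with a JMESPath filter narrows the list to the one image we want;
  # --output tsv strips the JSON quoting so the id goes cleanly into a variable.
  local image_id
  image_id="$(az image list --query "[?name=='${image_name}'].id" --output tsv)"
  az vm create --resource-group "$rg" --name "$vm" --image "$image_id"
}

# Example invocation (requires an authenticated Azure CLI session):
# create_vm_from_image linux-test-rg UbuntuMasterImage01 Ubuntu-VM-03
```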

Adjusting the automatically created Network Security Group

The specialization made to this image was enabling the RDP service on Ubuntu. Since the az vm create command also creates a network security group by default, we need to change that NSG to allow port 3389, the port RDP uses.

 

The following steps will guide you through adding a new security rule to the existing NSG:

  1. To obtain the NSG associated with this new virtual machine, first get the ID of its network interface
    az vm show --resource-group linux-test-rg --name Ubuntu-VM-03 --query 'networkProfile.networkInterfaces[0].id' --output tsv

    Result

    image

  2. Get the network security group ID
    az network nic show --ids /subscriptions/<GUID>/resourceGroups/linux-test-rg/providers/Microsoft.Network/networkInterfaces/Ubuntu-VM-03VMNic --query 'networkSecurityGroup.id' --output tsv

    Result

    image

  3. Get the network security group name
    az network nsg show --ids /subscriptions/<GUID>/resourceGroups/linux-test-rg/providers/Microsoft.Network/networkSecurityGroups/Ubuntu-VM-03NSG --query 'name' --output tsv

    Result

    image

  4. Add the network security rule to the existing NSG
    az network nsg rule create --resource-group linux-test-rg \
        --nsg-name Ubuntu-VM-03NSG \
        --name allow-rdp \
        --protocol tcp \
        --direction inbound \
        --priority 300 \
        --source-address-prefix "*" \
        --source-port-range "*" \
        --destination-address-prefix "*" \
        --destination-port-range 3389 \
        --access allow
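The three lookups in steps 1–3 can be chained so the NSG name falls out of the VM name without copying ids by hand. A minimal sketch under the same example names (the helper name find_nsg_name is mine, not from the original post):

```shell
#!/usr/bin/env bash
# Sketch: chain the VM -> NIC -> NSG lookups from steps 1-3 into one function.
set -euo pipefail

find_nsg_name() {
  local rg="$1" vm="$2"
  local nic_id nsg_id
  # Step 1: id of the VM's first network interface.
  nic_id="$(az vm show --resource-group "$rg" --name "$vm" \
      --query 'networkProfile.networkInterfaces[0].id' --output tsv)"
  # Step 2: id of the NSG attached to that NIC.
  nsg_id="$(az network nic show --ids "$nic_id" \
      --query 'networkSecurityGroup.id' --output tsv)"
  # Step 3: resolve the NSG id to its name.
  az network nsg show --ids "$nsg_id" --query 'name' --output tsv
}

# Example invocation (requires an authenticated Azure CLI session):
# find_nsg_name linux-test-rg Ubuntu-VM-03
```

The name it prints is what you pass to `az network nsg rule create --nsg-name` in step 4.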

Since XRDP does not come installed on Ubuntu by default, being able to connect over RDP demonstrates that the new virtual machine came from the specialized image you just created.

 

image 

 

I hope you have enjoyed this short post and have fun creating your own custom images in Azure.

 

For more information, please refer to the following documents:

 

Azure Managed Disks Overview

Create a Linux VM using the Azure CLI 2.0 Preview (az.py)

Add a disk to a Linux VM

Create a copy of a VHD stored as an Azure Managed Disk by using Managed Snapshots

Upload and create a Linux VM from custom disk by using the Azure CLI 2.0 (Preview)

 

See you in the next post!

 

Paulo

How to resolve the following issue: you re-enable the account of a returning employee and assign a license, but the mailbox is not provisioning.


A lot of customers open cases with Office 365 support saying: "the mailbox is not provisioning". In this article I will explain how this can happen and what we can do to solve the situation.

We have the following scenario:
– we have a hybrid Exchange configuration: local AD, local Exchange Server, DirSync, Office 365 tenant;
– we have a new employee;
– in the local AD we created a new user for them and synced it to the cloud;
– because the source of authority is on-premises, we must create the mailbox from the local Exchange Server (this is very important for free/busy, permissions and other features);
– we have 2 possibilities:
– create the mailbox on-premises and migrate it to the cloud;
– run Enable-RemoteMailbox from the local Exchange Server to create the mailbox directly in the cloud;
– now we have a user and a mailbox perfectly functional in a hybrid environment;
– from the local Exchange Server we can verify the RemoteMailbox status:
costin-003

**********

The user left the company.
In such cases, the typical steps are:
– disable the user;
– back up the mailbox content and perform all other customer-specific procedures;
– move the user in the local AD to an OU reserved for “former employees”;

What will happen next?
Because the user was moved out of the DirSync scope, DirSync will not find the user in the local AD and will assume that we deleted it. The deletion will be synced to the cloud and the MsolUser will be deleted. The user will remain in the “deleted users” state for 30 days. “Cloud AD” (Azure AD) and “Cloud Exchange” (Exchange Online, EXO) are synced with each other every 15 minutes. When EXO sees that the user is “deleted”, it will update the mailbox and bring it into a soft-deleted state. If the mailbox is placed on Litigation Hold, the mailbox will enter an inactive state.

Once more than 30 days have passed, the Azure AD user is permanently deleted, the mailbox is purged (unless placed on hold), and the assigned license goes back into the available license pool.

**********

But, HEY, our departed employee is back in the business and wants his accounts back. What should we do?
We enable his account in the local AD, move the account back from “former employees” to the OU where it belongs, sync it to the cloud, assign it a license … and … SURPRISE: a new cloud mailbox is not provisioned.

Why does this happen?

When the user was moved in the local AD from a synced OU to a non-synced OU, he retained all his Exchange properties, and the ExchangeGuid shows that he has a mailbox:
costin-004

Even if the mailbox in the cloud was soft-deleted and then purged, the change was not synced to the local AD. After the local user is re-enabled and moved to the proper OU, he will be synced to the cloud:
costin-005

The CloudExchangeRecipientDisplayType shows us: -2147483642 = SyncedMailboxUser.

The mailbox will not provision. Why? Because the cloud sees that the on-premises user has an ExchangeGuid, and that means a mailbox already exists. What should we do next? We must follow these steps:

– remove the user from the DirSync scope;
– force a DirSync; the MsolUser in the cloud will be deleted;
– run: Get-MsolUser -UserPrincipalName cloud-mod-test1@contoso.com ### should return an error like:

Get-MsolUser : User Not Found. User: cloud-mod-test1@contoso.com.
At line:1 char:1
+ Get-MsolUser -UserPrincipalName cloud-mod-test1@contoso.com
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo         : OperationStopped: (:) [Get-MsolUser], MicrosoftOnlineException

– Get-MsolUser -ReturnDeletedUsers ### cloud-mod-test1@contoso.com should be listed;
– Remove-MsolUser -RemoveFromRecycleBin -UserPrincipalName cloud-mod-test1@contoso.com ### delete it from the recycle bin;
– Get-MsolUser -ReturnDeletedUsers ### cloud-mod-test1@contoso.com should not be listed. Now we are sure that the MsolUser was completely deleted;
– wait 15 minutes;
– Get-MailUser -SoftDeletedMailUser ### you should see cloud-mod-test1@contoso.com here;
– Remove-MailUser -PermanentlyDelete cloud-mod-test1@contoso.com ### delete the mail user from the soft-deleted mail users;
– verify with Get-MailUser cloud-mod-test1@contoso.com and Get-MailUser -SoftDeletedMailUser cloud-mod-test1@contoso.com that the mail user was deleted;
– go to the on-premises Exchange Server;
– Get-RemoteMailbox cloud-mod-test1@contoso.com | FL Name, ExchangeGuid ### you should see something like: b6e89b85-7279-450d-a84c-3c5ccc639c27
– Set-RemoteMailbox cloud-mod-test1@contoso.com -ExchangeGuid "00000000-0000-0000-0000-000000000000" ### erase the proof that the user has a mailbox;
– Get-RemoteMailbox cloud-mod-test1@contoso.com | FL Name, ExchangeGuid ### you should now see: 00000000-0000-0000-0000-000000000000
– move the user back into the DirSync scope and force a DirSync;
– wait until the user appears in the Office 365 Admin center, then assign a license to it;
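The cleanup above can be summarized as one PowerShell sequence. This is only a sketch using the cmdlets already listed in the steps; cloud-mod-test1@contoso.com is the example UPN from this post, so substitute your own user, and run each part in the appropriate session (MSOnline, Exchange Online, or the on-premises Exchange Management Shell):

```powershell
# Example UPN from this post -- substitute your returning user.
$upn = "cloud-mod-test1@contoso.com"

# 1. After removing the user from the DirSync scope and forcing a sync,
#    purge the deleted MsolUser (MSOnline session):
Get-MsolUser -ReturnDeletedUsers | Where-Object { $_.UserPrincipalName -eq $upn }
Remove-MsolUser -RemoveFromRecycleBin -UserPrincipalName $upn

# 2. About 15 minutes later, purge the soft-deleted mail user (Exchange Online session):
Remove-MailUser -PermanentlyDelete $upn

# 3. On the on-premises Exchange server, clear the stale ExchangeGuid:
Set-RemoteMailbox $upn -ExchangeGuid "00000000-0000-0000-0000-000000000000"

# 4. Move the user back into the DirSync scope, force a sync, then assign a license.
```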

The mailbox should now be created.

**********
