
Meet the brand-new Surface family


  1. The Surface Pen features 1,024 levels of pressure sensitivity
  2. Sold separately
  3. After clicking the pre-order link, you will be asked to provide personal information to Microsoft via email in order to carry out this service. To learn more about the Microsoft Hong Kong privacy policy, please visit: http://www.microsoft.com/hk/privacy
  4. ERP – estimated retail price

Augment Image Metadata with Cognitive Services


Hello Folks,

Last month we partnered with a local startup to create a solution that addresses an ever-growing problem we all have: how do we categorize and index all our digital pictures in such a way that we can search and retrieve them based on picture content, and not just on date taken or location?

The Kwilt team is based out of Invest Ottawa. They are a startup with ambitious goals for their applications, and they currently have several apps that manage how people interact with their photos.

They've created an application that helps navigate the sea of photos scattered across social, cloud and messaging accounts. This is done in a manner that is seamless to the end user, as all the search work happens through the Kwilt app, both in a mobile app and on the web.

During the project, we leveraged Azure Cognitive Services to augment the capabilities of the app. We introduced capabilities that assist users facing the challenge of “tagging” all their photos to allow for more accurate searches, capabilities that address users mistyping tags (which would otherwise return no photos, or the wrong ones), and that handle accents and special characters in search (e.g. Montreal ≠ Montréal), and more…

Technologies used in this project:

  • Azure Storage
  • Azure Functions
  • Microsoft Cognitive Services
  • Computer Vision API
  • Azure Cosmos DB (formerly DocumentDB)
  • Azure Search

Here is how we get this done!


And by the way, the code we used is available here.

Ingest and Analyze

Ingesting all the data from the Kwilt database into Azure Cognitive Services allows the service to automatically tag photos, eliminating the need for manual user input. We started by building a robust, efficient workflow to push data from the Kwilt backend database to Azure in order to facilitate analysis with Cognitive Services.

You can test the capabilities of the service yourselves.
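For instance, you can call the Analyze endpoint directly from PowerShell. This is just a hedged sketch: the subscription key is a placeholder, and the image URL is the thumbnail from the sample message below.

# Quick test of the Computer Vision v1.0 Analyze endpoint.
# The subscription key is a placeholder; substitute your own.
$visionKey = '<your-cognitive-services-key>'
$uri = 'https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze?visualFeatures=Tags,Description'
$body = @{ url = 'https://farm1.staticflickr.com/333/18585231402_eac0b3fe77_z.jpg' } | ConvertTo-Json
$result = Invoke-RestMethod -Method Post -Uri $uri `
    -Headers @{ 'Ocp-Apim-Subscription-Key' = $visionKey } `
    -ContentType 'application/json' -Body $body
$result.description.captions | Format-Table text, confidence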

First, the data is received from the Kwilt backend into an Azure Storage queue. The process that feeds the queue is a proprietary PHP script that extracts new entries from the Kwilt database, converts them to JSON, and sends them as messages to the configured storage account (using the account name and key) via the Azure Storage PHP SDK.

Here is a sample message as it is stored in the queue.

{
    "sorting_time": "2015-06-07 22:50:36",
    "type": "image",
    "id": 68682364,
    "name": "010309_0800_6154_nals",
    "created_time": "2015-06-08 05:50:36",
    "width": 919,
    "height": 602,
    "mime_type": "image/jpeg",
    "size": 576761,
    "time_taken": "2015-06-07 22:50:36",
    "modified_time": "2015-06-08 05:50:38",
    "source_url": "https://farm1.staticflickr.com/333/18585231402_798c4247fe_o.jpg",
    "recent_time": "2015-06-07 22:50:36",
    "thumbnail_url": "https://farm1.staticflickr.com/333/18585231402_eac0b3fe77_z.jpg"
}
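The producer side is PHP in Kwilt's case, but the equivalent enqueue operation is easy to sketch in PowerShell with the Azure.Storage module. The account name, key and queue name here are hypothetical placeholders, and the queue is assumed to already exist:

# Hedged sketch: push one JSON message into an Azure Storage queue.
$ctx = New-AzureStorageContext -StorageAccountName 'kwiltingest' -StorageAccountKey '<storage-key>'
$queue = Get-AzureStorageQueue -Name 'images-to-analyze' -Context $ctx
$json = Get-Content '.\message.json' -Raw   # a message like the sample above
$msg = New-Object -TypeName 'Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage' -ArgumentList $json
$queue.CloudQueue.AddMessage($msg)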

Once a message is in the queue, it triggers the Azure Function via a queue trigger binding.
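A hedged reconstruction of what that binding configuration looks like in the function's function.json follows. The queue name, connection setting names, and the database and collection names are assumptions; the binding names match the code below (message in, outputDocument out):

{
  "bindings": [
    {
      "name": "message",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "images-to-analyze",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputDocument",
      "type": "documentDB",
      "direction": "out",
      "databaseName": "kwilt",
      "collectionName": "images",
      "createIfNotExists": true,
      "connection": "CosmosDBConnection"
    }
  ],
  "disabled": false
}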

Once the data is in the queue for analysis, we leverage Azure Functions to send the info to Cognitive Services to analyze the image. Of course, since we want to follow proper DevOps practices, we have configured Azure Functions for continuous integration from a GitHub repository set up for the various functions.

var https = require('https');

module.exports = function (context, message) {
  logInfo(context, 'Analyzing Image Id: ' + message.id);
  logVerbose(context, 'Queue Message:\n' + JSON.stringify(message));

  // Validate Configuration
  if (!process.env.OcpApimSubscriptionKey) {
    throwError(context, 'Missing Configuration, OcpApimSubscriptionKey not configured in Application Settings.');
  }

  // Validate Message
  if (!message.thumbnail_url) {
    throwError(context, 'Invalid Message, thumbnail_url missing.');
  }

  // Define Vision API options
  var options = {
    host: 'westus.api.cognitive.microsoft.com',
    port: 443,
    path: '/vision/v1.0/analyze?visualFeatures=Categories,Tags,Description,Faces,ImageType,Color,Adult&details=&language=en',
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      'Ocp-Apim-Subscription-Key': process.env.OcpApimSubscriptionKey
    }
  };

  logVerbose(context, 'Thumbnail Url: ' + message.thumbnail_url);

  var req = https.request(options, function (res) {
    res.setEncoding('utf8');

    res.on('data', function (data) {
      logVerbose(context, 'Vision API Response\n' + data);

      var visionData = JSON.parse(data);

      // Was the image successfully processed
      switch (res.statusCode) {
        case 200: // Success
          updateFileMetaData(message, visionData);
          break;
        case 400: // Error processing image
          var errorMessage = visionData.message ? visionData.message : "Unknown error processing image";
          logInfo(context, errorMessage);
          updateFileMetaData(message, null, visionData);
          break;
        case 403: // Out of call volume quota
          context.done(new Error('Out of call volume quota'));
          return;
        case 429: // Rate limit is exceeded
          context.done(new Error('Rate limit is exceeded'));
          return;
      }

      // Set the object to be stored in Document DB
      context.bindings.outputDocument = JSON.stringify(message);

      context.done();
    });
  });

  req.on('error', function (e) {
    logVerbose(context, 'Vision API Error\n' + JSON.stringify(e));
    throwError(context, e.message);
  });

  // write data to request body
  var data = {
    url: message.thumbnail_url
  };

  req.write(JSON.stringify(data));
  req.end();
};


function updateFileMetaData(message, visionData, error) {
  // Document DB requires ID to be a string
  // Convert message id to string
  message.id = message.id + '';

  // Keep a record of the raw/unedited Vision data
  message['azure_vision_data'] = {
    timestamp: new Date().toISOString().replace(/T/, ' ').replace(/\..+/, ''), // e.g. "2017-03-29 15:59:55"
    data: visionData,
    error: error
  };

  if (visionData) {
    // Flatten/append vision data to the file object
    message['isAdultContent'] = visionData.adult.isAdultContent;
    message['isRacyContent'] = visionData.adult.isRacyContent;
    message['auto_tags'] = extractConfidenceList(visionData.tags, 'name', 0.1);
    message['auto_categories'] = visionData.categories ? extractConfidenceList(visionData.categories, 'name', 0.1) : [];
    message['auto_captions'] = extractConfidenceList(visionData.description.captions, 'text', 0.1);
    message['auto_description_tags'] = visionData.description.tags;
    message['auto_dominantColorForeground'] = visionData.color.dominantColorForeground;
    message['auto_dominantColorBackground'] = visionData.color.dominantColorBackground;
    message['auto_accentColor'] = visionData.color.accentColor;
    message['auto_isBWImg'] = visionData.color.isBWImg;
    message['auto_clipArtType'] = visionData.imageType.clipArtType;
    message['auto_lineDrawingType'] = visionData.imageType.lineDrawingType;
  }

  // Convert existing tags field from comma-separated string to array
  if (message.tags && typeof message.tags === 'string') {
    message.tags = message.tags.split(',');
  } else {
    message.tags = [];
  }

  // Azure Search requires location to be a single field
  if (message.latitude && typeof message.latitude === 'number') {
    message['location'] = {
      type: 'Point',
      coordinates: [message.longitude, message.latitude]
    }
  }
}

function throwError(context, message) {
  logVerbose(context, 'Error: ' + message);
  throw new Error(message);
}

function logInfo(context, message) {
  context.log('+[Info] ' + message);

}

function logVerbose(context, message) {
  if (process.env.VerboseLogging) {
    context.log('![Verbose] ' + message);
  }
}

// Extracts a list of values by field from an array of objects
// where the confidence value is greater than or equal to the
// optional minConfidenceValue.
function extractConfidenceList(objArray, field, minConfidenceValue) {
  if (Object.prototype.toString.call(objArray) !== '[object Array]') {
    throw new Error("objArray (type: " + Object.prototype.toString.call(objArray) + ") in extractConfidenceList is not an array.");
  }

  if (!field || typeof field !== 'string') {
    throw new Error("field in extractConfidenceList is missing or not a string.");
  }

  // If no min confidence value is specified, or it is undefined, set it to 0
  if (!minConfidenceValue) { minConfidenceValue = 0; }

  var list = new Array();

  objArray.forEach(function (obj) {
    // Do we need to do a confidence check?
    if (minConfidenceValue > 0 && typeof obj['confidence'] === 'number') {
      // Is confidence >= min required?
      if (obj['confidence'] >= minConfidenceValue) {
        list.push(obj[field]);
      }
    }
    else {
      // No check needed push field into array
      list.push(obj[field]);
    }
  });

  return list;
}

Analysis of the images (visual features and details) is configured in the function where the Vision API HTTP call options are defined.

// Define Vision API options
  var options = {
    host: 'westus.api.cognitive.microsoft.com',
    port: 443,
    path: '/vision/v1.0/analyze?visualFeatures=Categories,Tags,Description,Faces,ImageType,Color,Adult&details=&language=en',
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      'Ocp-Apim-Subscription-Key': process.env.OcpApimSubscriptionKey
    }
  };

In this case, the following features were configured, in English (at the time, only English and Chinese were available for the API):

  • Categories – categorizes image content according to a taxonomy defined in the documentation.

  • Tags – tags the image with a detailed list of words related to the image content.

  • Description – describes the image content with a complete English sentence.

  • Faces – detects if faces are present. If present, generates coordinates, gender and age.

  • ImageType – detects if an image is clip art or a line drawing.

  • Color – determines the accent color, dominant color, and whether an image is black & white.

  • Adult – detects if the image is pornographic in nature (depicts nudity or a sex act). Sexually suggestive content is also detected.

Once the analysis is complete, the resulting JSON message is stored in Cosmos DB.

Here is the result of processing one of the photos – a street scene in Québec City:

{
	"sorting_time": "2016-06-19 15:59:58",
	"type": "image",
	"id": "68772289",
	"name": "Photo Jun 19, 15 59 58.jpg",
	"created_time": "2016-07-28 15:54:59",
	"modified_time": "2016-07-28 15:55:16",
	"size": 2289041,
	"mime_type": "image/jpeg",
	"latitude": 46.8124,
	"longitude": -71.2038,
	"geoname_id": 6325494,
	"city": "Québec",
	"state": "Quebec",
	"state_code": "QC",
	"country": "Canada",
	"country_code": "CA",
	"time_taken": "2016-06-19 15:59:58",
	"height": 3264,
	"width": 2448,
	"thumbnail_url": "",
	"recent_time": "2016-06-19 15:59:58",
	"tags": [
		"town",
		"property",
		"road",
		"neighbourhood",
		"residential area",
		"street"
	],
	"account_id": 111193,
	"storage_provider_id": 105156,
	"altitude": 53,
	"camera_make": "Apple",
	"camera_model": "iPhone 6",
	"azure_vision_data": {
		"timestamp": "2017-03-29 15:59:55",
		"data": {
			"categories": [
				{
					"name": "outdoor_street",
					"score": 0.96484375
				}
			],
			"adult": {
				"isAdultContent": false,
				"isRacyContent": false,
				"adultScore": 0.007973744533956051,
				"racyScore": 0.010262854397296906
			},
			"tags": [
				{
					"name": "outdoor",
					"confidence": 0.9993922710418701
				},
				{
					"name": "sky",
					"confidence": 0.9988007545471191
				},
				{
					"name": "building",
					"confidence": 0.9975806474685669
				},
				{
					"name": "street",
					"confidence": 0.9493720531463623
				},
				{
					"name": "walking",
					"confidence": 0.9154794812202454
				},
				{
					"name": "sidewalk",
					"confidence": 0.8519290685653687
				},
				{
					"name": "people",
					"confidence": 0.7953380942344666
				},
				{
					"name": "way",
					"confidence": 0.7908639311790466
				},
				{
					"name": "scene",
					"confidence": 0.7276134490966797
				},
				{
					"name": "city",
					"confidence": 0.624116063117981
				}
			],
			"description": {
				"tags": [
					"outdoor",
					"building",
					"street",
					"walking",
					"sidewalk",
					"people",
					"road",
					"city",
					"narrow",
					"bicycle",
					"man",
					"group",
					"woman",
					"standing",
					"old",
					"pedestrians",
					"holding",
					"platform",
					"parked",
					"carriage",
					"riding",
					"train",
					"clock"
				],
				"captions": [
					{
						"text": "a group of people walking down a narrow street",
						"confidence": 0.8872056096672615
					}
				]
			},
			"requestId": "38fa30e6-2a50-4a7f-b780-e6472c6d1a52",
			"metadata": {
				"width": 600,
				"height": 800,
				"format": "Jpeg"
			},
			"faces": [],
			"color": {
				"dominantColorForeground": "Grey",
				"dominantColorBackground": "Grey",
				"dominantColors": [
					"Grey",
					"White"
				],
				"accentColor": "2C759F",
				"isBWImg": false
			},
			"imageType": {
				"clipArtType": 0,
				"lineDrawingType": 0
			}
		}
	},
	"isAdultContent": false,
	"isRacyContent": false,
	"auto_tags": [
		"outdoor",
		"sky",
		"building",
		"street",
		"walking",
		"sidewalk",
		"people",
		"way",
		"scene",
		"city"
	],
	"auto_categories": [
		"outdoor_street"
	],
	"auto_captions": [
		"a group of people walking down a narrow street"
	],
	"auto_description_tags": [
		"outdoor",
		"building",
		"street",
		"walking",
		"sidewalk",
		"people",
		"road",
		"city",
		"narrow",
		"bicycle",
		"man",
		"group",
		"woman",
		"standing",
		"old",
		"pedestrians",
		"holding",
		"platform",
		"parked",
		"carriage",
		"riding",
		"train",
		"clock"
	],
	"auto_dominantColorForeground": "Grey",
	"auto_dominantColorBackground": "Grey",
	"auto_accentColor": "2C759F",
	"auto_isBWImg": false,
	"auto_clipArtType": 0,
	"auto_lineDrawingType": 0,
	"location": {
		"type": "Point",
		"coordinates": [
			-71.2038,
			46.8124
		]
	}
}

Once this analysis is stored in Cosmos DB, it can be indexed and searched using Azure Search. The Kwilt team was able to digest the analysis and, with Azure Search, build an extremely user-friendly search experience for their users.
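To give an idea of what that looks like, here is a hedged sketch of an Azure Search index definition over the Cosmos DB documents. The field names mirror the document above, while the index name and attribute choices are assumptions:

{
  "name": "kwilt-images",
  "fields": [
    { "name": "id", "type": "Edm.String", "key": true },
    { "name": "name", "type": "Edm.String", "searchable": true },
    { "name": "auto_tags", "type": "Collection(Edm.String)", "searchable": true, "filterable": true },
    { "name": "auto_captions", "type": "Collection(Edm.String)", "searchable": true },
    { "name": "city", "type": "Edm.String", "searchable": true, "filterable": true },
    { "name": "location", "type": "Edm.GeographyPoint", "filterable": true }
  ]
}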

In the project beta client we were able to search by keywords (Food, Plates, Fireworks…) and by location (Gatineau) without any manual tagging of the pictures.
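A keyword-plus-location query against such an index can be issued straight from PowerShell. In this sketch the service name, index name, API key and coordinates are hypothetical; note that geo.distance in Azure Search measures distance in kilometers:

# Hedged sketch: find 'fireworks' photos within ~25 km of Gatineau.
$apiKey = '<query-key>'
$uri = "https://kwilt-search.search.windows.net/indexes/kwilt-images/docs" +
       "?api-version=2016-09-01&search=fireworks" +
       "&`$filter=geo.distance(location, geography'POINT(-75.70 45.48)') lt 25"
Invoke-RestMethod -Uri $uri -Headers @{ 'api-key' = $apiKey }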


All I can say is that I cannot wait to process my own photo streams through this service. I've already installed the app on my phone…

Cheers!!


Pierre Roman
@pierreroman

Updates for Surface Pro 3 (06 June 2017)


Today we’ve released updated drivers for the Surface Pro 3. This update includes new drivers for Surface Pro Embedded Controller Firmware and Surface Pro UEFI that improve battery life during sleep and PXE performance on IPv6.

These updates are available in MSI and ZIP format from the Surface Pro 3 Drivers and Firmware page in the Microsoft Download Center. Click Download to download the following files.

  • SurfacePro3_Win10_14393_1702002_0.msi
  • SurfacePro3_Win10_14393_1702002_0.zip
  • SurfacePro3_Win10_10586_1702002_0.msi
  • SurfacePro3_Win10_10586_1702002_0.zip
  • SurfacePro3_Win8x_9600_1702002_0.msi
  • SurfacePro3_Win8x_9600_1702002_1.zip

Note: In the name of each of these files you will find a Windows build number; this number indicates the minimum supported build required to install the drivers and firmware contained within. For example, to install the drivers contained in SurfacePro3_Win10_14393_1702002_0.msi, you must have Windows 10 Version 1607, the Anniversary Update, or newer installed on your Surface Pro 3. You can find a list of the build numbers for each version of Windows 10 in Windows 10 release information.

Note: Some earlier releases of Surface driver and firmware for Windows 10 do not contain a minimum build number in the file name. These files have a minimum supported build of Build 10240, Windows 10 Version 1507, RTM.
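If you're unsure which build a Surface Pro 3 is currently running, you can, for instance, read it from the registry with PowerShell before picking a file:

# Returns e.g. 14393 for Windows 10 Version 1607 (the Anniversary Update).
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').CurrentBuild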

Surface Pro 3

  • Surface Pro Embedded Controller Firmware v38.13.50.0 improves battery life during sleep.
  • Surface Pro UEFI v3.11.2150.0 improves PXE performance on IPv6.

Inside Microsoft – a look behind the scenes


Exciting stories, exclusive events, product tests and direct exchange: Inside Microsoft is our new program for everyone who is enthusiastic about Windows, Office, Surface, HoloLens and other Microsoft technologies.

Our users play an important role in the development of our company, because they inspire us to continuously improve our products and services. Open feedback and direct contact are essential for this, which is why we created Inside Microsoft.

What is Inside Microsoft?

Inside Microsoft is a local community program and an exchange platform for everyone who uses Microsoft products and technologies.

We keep you up to date with news, share tips, reveal tricks and, above all, look forward to your comments on the various topics.

Alongside opportunities for product tests and ask-the-expert sessions, we also regularly offer exciting events that give you a look behind the scenes.

Microsoft fan events in Berlin, Cologne and Munich in June

In June you have the opportunity to take part in one of these events in Berlin, Cologne or Munich. The focus is on Surface and Windows Ink and, of course, on direct exchange with you on site. Anyone registered on our portal www.inside.ms can sign up for the events.

Experience the Surface family up close in Berlin

On 13 June 2017, from 6 p.m., we will present our Surface products at our Berlin office, including the new Surface Pro, Surface Laptop and Surface Studio.

You can also look forward to exciting talks on the Surface family and Windows 10, and of course hands-on time to try out the new devices for yourself on site. Naturally, we will be available in person for your questions and look forward to lively discussions with you.

Sign up here.

Everything about Windows Ink in Cologne and Munich

On 21 June 2017 in Cologne and on 28 June 2017 in Munich, everything in our two offices will revolve around the pen and Windows Ink.

Together with our partners notebooksbilliger.de and Cyberport, we will present the possibilities of pen and touch input in Windows 10 and Office 365. We also want to show you our new loyalty program, “Microsoft Rewards”. After our presentation, you will have the chance to test several devices in depth and to submit a review.

Sign up here.

A post by Tobias Röver

Social Media B2C Lead, Microsoft Deutschland

Hardware independent automatic Bitlocker encryption using AAD/MDM


Windows 10 delivers a “mobile-first, cloud-first” approach of simplified, modern management using cloud-based device management solutions such as Microsoft Enterprise Mobility Suite (EMS). This enables mobile users to be more productive regardless of location. At the same time, organizations require data to be safe, especially with 2018's GDPR in mind. Most organizations require a form of disk encryption such as BitLocker.

In one of my previous blog posts you might have read about the requirement for InstantGo-capable devices to automate BitLocker configuration on the device and back up the recovery key to the user's Azure AD account. Windows 10 1703, also known as the Creators Update, offers a wizard where users are prompted to start encryption regardless of the hardware used. I've received a lot of feedback regarding the need to automate encryption rather than relying on end users to do so.

Recently I received a few scripts that allow the triggering of a fully automated Bitlocker encryption process regardless of hardware capabilities. This is provided by DXC and based on previous work from Jan Van Meirvenne and Sooraj Rajagopalan – thanks for your work and willingness to share.

I’ve tuned the scripts a bit, wrapped them into an MSI – ready to be uploaded in Intune and deployed to a group of users.

How does this solution work?

The MSI attached to this blog does the following:

  • Deploys three files into C:\Program Files (x86)\BitLockerTrigger
  • Imports a new scheduled task based on the included Enable_Bitlocker.xml

The scheduled task will run every day at 2PM and will do the following:

  • Run Enable_Bitlocker.vbs, whose main purpose is to call Enable_BitLocker.ps1 and make sure it runs minimized.
  • In turn, Enable_BitLocker.ps1 encrypts the local drive and stores the recovery key in Azure AD and OneDrive for Business (if configured); a minimal sketch of this approach follows below.
    • The recovery key is only stored when it has changed or is not yet present.
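The exact script ships inside the MSI; purely as a hedged illustration of the core approach (not the DXC script itself), enabling encryption and backing the protector up to Azure AD can look like this:

# Hedged sketch of the essential steps - not the exact script from the MSI.
$mount = $env:SystemDrive

# Encrypt the OS drive if it is not already protected, using the TPM.
if ((Get-BitLockerVolume -MountPoint $mount).ProtectionStatus -eq 'Off') {
    Enable-BitLocker -MountPoint $mount -EncryptionMethod XtsAes128 -UsedSpaceOnly -TpmProtector -SkipHardwareTest
    Add-BitLockerKeyProtector -MountPoint $mount -RecoveryPasswordProtector
}

# Back up the numerical recovery password to the Azure AD device object.
$rp = (Get-BitLockerVolume -MountPoint $mount).KeyProtector |
      Where-Object KeyProtectorType -eq 'RecoveryPassword' |
      Select-Object -First 1
BackupToAAD-BitLockerKeyProtector -MountPoint $mount -KeyProtectorId $rp.KeyProtectorId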

 

How can users get access to their recovery key?

The recovery key is written to two locations: the Azure AD account and a recovery folder in OneDrive for Business (if configured). Users can retrieve the recovery key by navigating to their profile at http://myapps.microsoft.com, or in their OneDrive for Business\recovery folder.

Azure AD:

OneDrive for Business:

Important to know:

  • After the script has run (most likely after 2PM), a reboot will be required before the initial BitLocker encryption starts – users will be prompted.
  • Upload the MSI to Intune as a “Line-of-business” app. I've tested by deploying it to a user group as “required” using the new portal at http://portal.azure.com.
  • The script doesn't take any potential 3rd-party encryption software into account. Only deploy this MSI to devices where BitLocker will be the only disk encryption solution in place.
  • Please test this MSI on your own devices and in your own environment before broadly deploying.
  • As mentioned earlier: the recovery key can be found in Azure AD, either by the tenant administrator or by the end user at http://myapps.microsoft.com. The recovery key will also be stored in the OneDrive for Business\Recovery folder.
  • This solution has only been tested on Windows 10 x64. You can even test on a virtual machine, as long as you assign a virtual TPM.

A lot of time and testing has gone into this project; if it's useful to you, please consider leaving a reply.

Downloads:

Hello world!


This blog site will provide guidance on the recommended practices for managing Active Directory Domain Services, Active Directory Federation Services, and Azure Active Directory.

512MB should be enough for anybody


A quick blog on Hyper-V and OS deployment. I was scratching my head over an error while deploying an operating system through MDT. The error came from DISM, in bdd.log:

Error 14
Not enough storage space is available to process this command.

So, there are various fixes out there on the internet, from replacing the media you're installing from to replacing the hard drive you're installing on (so much cheaper to do this in a virtual environment, and fewer screws to turn).

Well, let’s look at Hyper-V’s settings for a moment. These settings are the same within Generation 1 and 2 VMs:

So, these 3 options:
RAM: This is the amount of memory that Hyper-V will allocate when the machine boots up, or some would call that at POST. In this case, 1GB.

Minimum RAM: This is the amount of memory that Hyper-V will shrink to, if it can. In this case, only 512MB. This is the big red flag.

Maximum RAM: This is the most memory that Hyper-V will ever allocate to a VM (and you do want to limit this). In this case, no more than 2GB of RAM will be allocated to the VM.

Why is this important?
I have found that modern operating systems (especially Windows 10 and Server 2016) really need a minimum of 1GB to boot and maintain a stable OS deployment; any less can be problematic. At the time of the DISM error, the VM had about 900MB allocated to it. So, if you are trying to test that deployment on a laptop with 8GB of RAM, you're going to have to give up the resources to the VM.

The moral of the story: when deploying the more modern versions of Windows from MDT, or SCCM for that matter, set your VM to a minimum of 1GB (a quick PowerShell sketch follows below). If you can swing it, set the minimum RAM back down after the deployment.
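You can set this from PowerShell as well; 'MDT-Test01' is a placeholder VM name, and the values mirror the advice above:

# Give a deployment VM at least 1GB so DISM has room to work.
Set-VMMemory -VMName 'MDT-Test01' -DynamicMemoryEnabled $true `
    -StartupBytes 1GB -MinimumBytes 1GB -MaximumBytes 2GB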

Changing individual quick action buttons via GPP


I was asked today if there was a way to change individual quick action items via GPO, and unfortunately, there isn’t. You can, however use Group Policy Preferences (GPP) to do this.

Take for example, switching to Tablet Mode. These are the 3 registry keys involved.

Path: HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\ImmersiveShell
Note that all of the keys below are of type REG_DWORD

Key: TabletMode
Values
0 = Off
1 = On

Key: SignInMode
Values
0 = Auto Switch to tablet mode
1 = Go to the desktop
2 = Remember what I used last

Key: ConvertibleSlateModePromptPreference
Values
0 = Don’t ask me
1 = Always ask me before switching
2 = Don’t ask me and always switch
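If you'd rather push these values with a script than with GPP (when testing, for example), the same keys can be written from PowerShell; this is simply an illustration of the values documented above:

# Illustrative only: write the quick-action values described above.
$path = 'HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\ImmersiveShell'
Set-ItemProperty -Path $path -Name TabletMode -Value 1 -Type DWord                            # 1 = On
Set-ItemProperty -Path $path -Name SignInMode -Value 2 -Type DWord                            # 2 = Remember what I used last
Set-ItemProperty -Path $path -Name ConvertibleSlateModePromptPreference -Value 1 -Type DWord  # 1 = Always ask me before switching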

Hope this helps someone else.


I’m back, but I’m not who I used to be


After an 11 year break – I’m coming back to blogging – but I’m not who I used to be.

I started this blog way back in 2005, when I was a SQL Server Evangelist. In 2006 I returned to Enterprise sales and I couldn't find the time to properly share my learnings on this blog.

In 2010 I left Microsoft and followed in the footsteps of Donald Farmer to join Qlik. If truth be known, I had begun to lose faith in Microsoft's BI strategy, and at the time Qlik was blazing a trail with its ‘Associative' technology in the newly identified data discovery market. This technology made developing very competent analytical dashboards both a breeze and a pleasure.

But in April last year I was tempted back to work in the Azure team as a Data Solution Architect, to help customers realise the benefits of Azure's Data Services. Of course, it hadn't escaped me that while I had been away Microsoft had released Power BI, which, along with its close integration with the amazing advances in Azure's data services, was clear evidence that Microsoft was back on track with a coherent BI strategy and had become a serious player in the data discovery space.

So, right now I’m planning my first (proper) blog post, in which I will share my first experience with U-SQL in solving a customer’s challenge.

Matthew

Convert VMDK to VHD fails with the entry 2 is not a supported disk database entry


Hi All

I have been playing around with a PBX platform that runs only on VMware, and of course in vmdk format. I was keen to use it in Hyper-V, so I needed to build it in VMware and convert it to Hyper-V after the fact. Microsoft provides a VMDK to VHD/VHDX converter tool called the Microsoft Virtual Machine Converter. You can download it here: https://www.microsoft.com/en-us/download/details.aspx?id=42497.

The nice thing about the tool is that you can convert your vmdks using PowerShell. Anyway, when I ran the conversion cmdlet I got the error below.

PS E:\VMDK> ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath E:\VMDK\vm1.vmdk -VhdType DynamicHardDisk -VhdFormat vhdx -DestinationLiteralPath e:\vmdk\vhdx

ConvertTo-MvmcVirtualHardDisk : The entry 2 is not a supported disk database entry for the descriptor.

At line:1 char:1

+ ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath E:\vmdk\vm1.vmdk – …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+ CategoryInfo : WriteError: (Microsoft.Accel…nversionService:DriveConversionService) [ConvertTo-MvmcVirtualHardDisk], VmdkDescriptorParseException

+ FullyQualifiedErrorId : DiskConversion,Microsoft.Accelerators.Mvmc.Cmdlet.Commands.ConvertToMvmcVirtualHardDiskCommand

 

ConvertTo-MvmcVirtualHardDisk : One or more errors occurred.

At line:1 char:1

+ ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath E:\vmdk\vm1.vmdk – …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+ CategoryInfo : WriteError: (Microsoft.Accel…nversionService:DriveConversionService) [ConvertTo-MvmcVirtualHardDisk], AggregateException

+ FullyQualifiedErrorId : DiskConversion,Microsoft.Accelerators.Mvmc.Cmdlet.Commands.ConvertToMvmcVirtualHardDiskCommand

 

So, not being a VMware guy, I didn't have much idea of what the error was. Luckily others have hit the issue before, and I stumbled onto a workaround on Stack Overflow: https://stackoverflow.com/questions/37481737/error-when-converting-vmware-virtual-disk-to-hyperv. The recommendation was to use the dsfok toolset to update the vmdk descriptor information in the file, to make the Microsoft VM Converter tool happy. Once this was done it worked a treat. First I dumped the descriptor out of the vmdk:

C:\scratch\dsfk\dsfok>dsfo.exe “e:\vmdk\vm1.vmdk” 512 1024 descriptor1.txt

OK, dumped 1024 bytes from offset 512

I then edited the descriptor1.txt file in Notepad++ and commented out the tools.InstallType attribute.

Then I needed to reapply the descriptor to the vmdk to make it convertible. You use the dsfi.exe tool to do that.

#After you edit the descriptor1.txt file

C:\scratch\dsfk\dsfok>dsfi.exe “e:\vmdk\vm1.vmdk” 512 1024 descriptor1.txt

OK, written 1024 bytes at offset 512

You can get the dsfok tools from here: http://www.softpedia.com/get/System/Back-Up-and-Recovery/dsfok.shtml

 

And yehhhh, now my VMDK convert PowerShell is working……

ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath E:\VMDK\vm1.vmdk -VhdType DynamicHardDisk -VhdFormat vhdx -DestinationLiteralPath e:\vmdk\vhdx

Now to see if I can get the VM running in Hyper-V. Fingers crossed, as it will be kewl to add to my SfB lab.

 

Happy Skype’ing

 

Steve

Announcement regarding SSL certificate renewal


Today, a notification regarding certificate renewal for Azure services was sent to customers.

This notification was sent in English, even to customers in Japan, and may have caused concern because it is not immediately clear whether action is required or whether any risk has arisen.

Based on the blog article from our US headquarters referenced below, we have prepared the following guide to this notification. We hope you find it helpful.

Reference: Azure TLS Certificates changes

Introduction

So that you can use Azure services securely, many services provide access over SSL/TLS. The server certificates used here are issued by designated Microsoft intermediate certification authorities. Details of these certification authorities are published in the Certificate Practice Statement (CPS).

Some customers implement a mechanism called certificate pinning in their applications. Certificate pinning is (to explain it very simply) a mechanism by which the client application specifies in advance which certificates it trusts.

When certificates change, for example because a certification authority is replaced, applications that implement certificate pinning must be updated so that the new certification authority is trusted. If an application is not updated, it cannot verify the new server certificate and SSL/TLS communication fails. To prevent this, Microsoft does not change certification authorities abruptly; instead it announces the change in advance and provides a preparation period before the actual change.

The intermediate certification authorities that currently issue certificates for Azure services expire in May 2018. Accordingly, Microsoft published new intermediate certification authorities in July 2016, and Azure services will begin using certificates signed by these new intermediate certification authorities on July 27, 2017. If you use applications that are affected when certificates change, such as those using certificate pinning, you must update them to support the new certificates by July 27, 2017.

What is changing

Currently, TLS certificates for Azure services are issued by the following intermediate certification authorities:

CN  Thumbprint
Microsoft IT SSL SHA2  97 ef f3 02 86 77 89 4b dd 4f 9a c5 3f 78 9b ee 5d f4 ad 86
Microsoft IT SSL SHA2  94 8e 16 52 58 62 40 d4 53 28 7a b6 9c ae b8 f2 f4 f0 21 17

From July 27, 2017, TLS certificates for Azure services will also be issued by the following four intermediate certification authorities, in addition to the two above:

CN  Thumbprint
Microsoft IT TLS CA 1  41 7e 22 50 37 fb fa a4 f9 57 61 d5 ae 72 9e 1a ea 7e 3a 42
Microsoft IT TLS CA 2  54 d9 d2 02 39 08 0c 32 31 6e d9 ff 98 0a 48 98 8f 4a df 2d
Microsoft IT TLS CA 4  8a 38 75 5d 09 96 82 3f e8 fa 31 16 a2 77 ce 44 6e ac 4e 99
Microsoft IT TLS CA 5  ad 89 8a c7 3d f3 33 eb 60 ac 1f 5f c6 c4 b2 21 9d db 79 b7

* CA #3 is skipped; please ignore this.

The issuer information will be as follows (apart from the CN, nothing changes from the current values):

  • CN = Microsoft IT TLS CA 1 [2,4,5]
  • OU = Microsoft IT
  • O = Microsoft Corporation
  • L = Redmond
  • S = Washington
  • C = US

In addition, the CRL (certificate revocation list) endpoint and the OCSP endpoint will change as follows, so please confirm that these endpoints are reachable:

Current CRL distribution point  http://cdp1.public-trust.com/CRL/Omniroot2025.crl
New CRL distribution point  http://crl3.digicert.com/Omniroot2025.crl
OCSP  http://ocsp.digicert.com

This information is also published in the Certificate Practice Statement (CPS).

Impact of this change

Most customers are not affected by this change, but you may be affected in the following cases:

  1. You use an application that manages trusted certificates through certificate pinning, as described above.
  2. Firewall or proxy rules are configured in a way that blocks access to the new CRL distribution point or the OCSP endpoint.

You can check whether you might be affected by certificate pinning as follows:

  1. Check whether your application's source code contains the thumbprints of the current intermediate certificates, “97 ef f3 02 86 77 89 4b dd 4f 9a c5 3f 78 9b ee 5d f4 ad 86” or “94 8e 16 52 58 62 40 d4 53 28 7a b6 9c ae b8 f2 f4 f0 21 17”. If the source code contains these values, it may include logic that trusts only these certificates.
  2. If the application was purchased from a third party, check with the application vendor.

If you may be affected by certificate pinning, you must update your application so that the new intermediate certification authorities are trusted.

Similarly, if the CRL distribution point or the OCSP endpoint cannot be reached, you must change your firewall or proxy server configuration so that they become reachable.
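As a quick way to verify that the new endpoints are reachable from your network, you can, for example, fetch the CRL with PowerShell (a minimal sketch; any HTTP client will do):

# Minimal reachability check for the endpoints listed above.
Invoke-WebRequest -Uri 'http://crl3.digicert.com/Omniroot2025.crl' -OutFile "$env:TEMP\Omniroot2025.crl"
Invoke-WebRequest -Uri 'http://ocsp.digicert.com' -Method Head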

FAQ

When will the current intermediate certification authorities stop being used?
Certificates issued by the current intermediate certification authorities expire on May 7, 2018. After that date, the old certificates no longer need to be trusted.

Will the certificates of all Azure services be renewed at the same time on July 27, 2017?
No. Certificate renewal will be rolled out in stages, at different times for different services; not every service will switch to the new certificates immediately.

3 ways to grow your revenue with the Cloud Solution Provider program [Updated June 8]


(This article is a translation of 3 Ways the Cloud Solution Provider Program Can Lead to Higher Revenue, posted on the Microsoft Partner Network blog on April 19, 2017. For the latest information, please refer to the original page.)

As you know, the cloud market is growing year after year and is expected to exceed 500 billion dollars by 2020.

That said, many partners are not yet ready to turn this growing demand into a business opportunity. What's more, simply reselling the cloud itself will not deliver the kind of business growth that Microsoft and its partners are aiming for.

Business models such as managed services and IP development have the potential to account for the majority of your revenue. So how do you reach that level? Fortunately, there are several ways to start working toward these highly profitable business models right now.

What I often recommend is the Cloud Solution Provider (CSP) program, which helps you build a managed services model. At its core, CSP is a licensing model, but it brings three important business opportunities that other programs you may have used do not offer.

Opportunity 1: More touchpoints

Comparing CSP with other licensing models, one difference stands out: because the billing cycle is monthly, you get more touchpoints with your customers. This is a major advantage. More touchpoints allow you to get closer to your customers, build stronger relationships of trust, and establish your position as a trusted advisor. As a result, you have a better chance of moving beyond the traditional resale and project services business models into a more profitable managed services business model.

Opportunity 2: Support reveals changing needs

Whether in a direct or an indirect model, partners who provide always-on support services become indispensable to their customers. By managing workloads and responding to outages and security breaches, you quickly become an ally supporting your customers' day-to-day operations. When you provide this level of service, the challenges your customers face naturally become visible. Once you have identified those challenges, you can offer value-added services on top of the licenses you provide through CSP.

Opportunity 3: Recurring revenue

With a monthly billing cycle, customers can predict their IT costs and keep within budget, while partners earn recurring revenue. Add value-added services such as data backup, security and assessments, and you are ready to enter the 500-billion-dollar cloud market.

If you want to build an even more profitable cloud business model, you can use CSP to package subscriptions together with your own managed services and fully control the customer experience. Once you earn recurring managed services revenue every month, you can not only expand your share of the customer's licensing spend, but also grow new recurring revenue streams over the life of the customer's subscription.

Managed services hold the potential to significantly grow partner revenue. Through the Cloud Solution Provider program, Microsoft provides the resources you need to build managed services.

For details, see the Cloud Solution Provider program page. It shows you how to go beyond selling licenses to build rewarding customer relationships and a profitable cloud practice.

How has the Cloud Solution Provider program improved the profitability of your cloud practice? We would love to hear your thoughts.


Shape sizes, or how to fit a sink into a small bathroom


Whether you are creating an office layout or designing the electrical network for a logistics warehouse full of exotic fruit, you need to account for the real dimensions of the objects involved. The Size & Position window in Visio helps make sure not a single centimeter is lost.

You can open this window from the View tab: in the Show group, select Task Panes, and then Size & Position. To see the dimensions of a particular shape, select it.

Height and width of real-world shapes

Imagine the task: you need to comfortably seat four employees in one of the rooms of your office. You know the exact dimensions of the room and of the desks, and you want to see whether there is enough space for four desks, or whether one employee will have to work from home, saving workspace and other company resources. Each desktop measures 0.8 × 1.5 meters; the room is 2.5 meters wide and 3 meters long.

Open the Office Furniture stencil. You can read about where to find this stencil in our earlier article. In our case we used the ready-made workspace plan template, which you can find on the Visio start page.

Pick a suitable desk from the stencil and drag it onto the page. By default, Visio sets the desk's size to 0.965 × 1.734 meters. Instead of trying to hit the exact size by dragging the selection handles, we will open the Size & Position window and enter exact values:

  1. Select the desk shape on the document page.
  2. In the Size & Position window, select the value in the Width field, enter 80 cm, and press ENTER.
  3. Select the value in the Height field and replace it with 150 cm.

Note that you need to type 150 cm rather than using a period or comma (1.5 m); otherwise Visio will report an error.

Now the shape exactly matches the size of the real desk.

We carry on and copy the real-size desk into the room. Having barely squeezed in 3 desks, we realize that we need to send two employees to work from home, or buy smaller desks, although the drawing makes it obvious that only one person can work in this room in comfort; two would already be cramped.

Which gets us thinking about remote workplaces and tools for working effectively from home – Microsoft Office.

Setting values with formulas

Another convenient way to enter values in the Size & Position window is to type mathematical expressions, just as we are used to doing in Excel:

  • Addition (+)
  • Subtraction (-)
  • Multiplication (*)
  • Division (/)

To calculate the final result and apply it to the shape, press ENTER.

Coordinates and the pin position

To lay objects out precisely on a diagram, you can set their coordinates. The values in the X and Y fields show the distance from the diagram's origin to the pin position of the selected shape.

As a rule, the origin is in the lower-left corner of the page, although in some templates it may be located elsewhere. To find the origin, select a shape and enter the value 0 (zero) in the X and Y fields. The shape will move so that its pin position sits exactly at the origin.

The pin position determines not only how the X and Y values affect the shape's position, but also the point around which the shape rotates. For example, if a shape needs to rotate around a corner rather than its center, you can specify that corner using the drop-down list in the Pin Pos field.

What do the sink and the bathroom have to do with all this, you ask? In exactly the same way, you can work out a convenient position for any element of a room in the space available. Visio has ready-made stencils and templates for almost every scenario.

Secure Your Workload, Secure Your Cloud Journey



With cloud technology growing exponentially, it can be challenging to identify threats before they arise. The challenge lies not in the security of the cloud itself, but in the policies and technologies for security and control of the technology.

Clouds are secure. The question lies in how you manage and control your cloud.

Learn from industry leaders Datapipe and Microsoft how companies can secure business-critical workloads in Microsoft Azure.

Secure your seat. Register Now.

Date: 22 June (Thursday)
Time: 2:00pm to 4:00pm
Venue: Microsoft Experience Center, 15/F, Cyberport 2, 100 Cyberport Road, Hong Kong

> Attend & Get a Free Azure Readiness Assessment by Datapipe* <

*By clicking the registration button, you will be directed to a third-party website. The personal information collected for this event will be used by the organizers, our affiliates, agents, and/or vendors for administering this campaign. If you would like more information about the Microsoft Hong Kong data privacy policy, please go to http://www.microsoft.com/hk/privacy.


An introduction to UserVoice for the Office products


Hello, this is the Office support team.
In this post, we'd like to introduce UserVoice.

Microsoft collects user requests for the products and services we provide, including the Office products, on a site called UserVoice.
By feeding the requests gathered through UserVoice into future product development, we aim to deliver products and services that are better suited to real-world use.

If you have ideas that have come up in actual use, such as “this feature would make the product easier to use” or “please bring back a convenient feature that used to exist,” please use the UserVoice sites below, which are set up for each product and service.

You can also support an existing idea by giving it a “Vote”.

[UserVoice URLs for each Office product]
Excel
https://excel.uservoice.com/

Word
https://word.uservoice.com/

PowerPoint
https://powerpoint.uservoice.com/

Access
https://access.uservoice.com/

OneNote
https://onenote.uservoice.com/

Visio
https://visio.uservoice.com/

UserVoice for the Office 365 products is covered separately on the following page:
How to register on UserVoice

https://answers.microsoft.com/wiki/a5081ef3-83f1-46b4-8032-37416f3925f2

[Notes on using UserVoice]
Ideas registered on these sites go to the development teams (or teams close to development), so with a few exceptions they need to be submitted in English.
Please use a translation service such as Bing Translator below to submit your ideas in English.

Bing Translator
https://www.bing.com/translator/

Also, when submitting a request, phrasing it like example 2 (“it would be easier to use if xx were possible”, “a feature that does xx would make xx possible”) rather than like example 1 (“xx is hard to use”, “xx cannot be done”) makes it more likely that the idea will be recognized as one that leads to concrete improvements. The English sentences in the examples below were produced with Bing Translator.

Example 1)
Excel のツリーマップが使いにくい。
Excel tree maps are hard to use.

Example 2)
Excel のツリーマップの各要素のサイズを変更できるようにしてほしい。
Want to be able to customize the size of each element in the Excel tree map.

[Feedback via the smile icon]
Besides UserVoice, you can also send us feedback in the Office products via the smile icon below.
This feature accepts feedback in Japanese as well, so please make use of it.

Notes
The contents of this post (including attachments and links) are current as of the date of writing and are subject to change without notice.


5 areas in which Visual Studio 2017 Enterprise extends Visual Studio 2017 Professional



By Giles Davies, Visual Studio Technical Specialist at Microsoft

This is an updated version of the 6 differences between Visual Studio 2015 Enterprise and Visual Studio 2015 Professional article, which now includes the new functionality added in 2017.

Visual Studio 2017 Enterprise provides a wide range of functionality and therefore I’ll group the capabilities into five areas:

  1. Code quality
  2. Testing
  3. Build and release
  4. Xamarin
  5. Subscription benefits

Let's take a look at what Enterprise provides in each of those areas, over and above what Professional offers. If you want to know specifically what's new in Enterprise in 2017 (i.e. not in 2015 Enterprise), the short list is Live Unit Testing, Live Architectural Dependencies, Packages, Build and Release Pipelines, Xamarin and subscription benefits. These are all introduced below.

Code Quality

There are a range of tools to help developers write the highest quality code, and that build on the core unit testing capabilities in Professional:

IntelliTest

IntelliTest was introduced in 2015 and analyses your source code and then creates unit tests to achieve 100% code coverage for each path through that code.

That means that you can get a lot more of your code covered by unit tests for less effort, helping you to add unit tests to code that doesn’t have any and making it easy to keep on top of creating unit tests for new code. It doesn’t mean that you’ll never write a unit test again, but I’d consider this a means of getting the core unit tests generated for you, allowing you to concentrate on specific tests and scenarios based on your domain knowledge.

You can tailor it to allow exceptions, override object creation and much more.

It will work with MS Test (including MS Test v2 now in Visual Studio 2017), NUnit, xUnit and any other testing framework that provides an adapter for use in Visual Studio. It is currently limited to C#.

Here’s a 35 min video walking through IntelliTest.

Live Unit Testing (new for 2017)

What's the earliest point at which your code can be tested with unit tests? How about as you're typing? That's what Live Unit Testing provides, with line-level coverage and test outcomes shown in the editor as you work. It also works whilst debugging.

Read about it or watch a 10 min video.

Fakes

Microsoft Fakes has been around for a while but it’s perhaps an overlooked Enterprise capability that allows you to isolate parts of your overall solution to help with unit testing. Fakes splits into:

  • Stubs, which allow you to mock parts of your application (e.g. a component or service) to make unit testing easier or possible. Read more here.
  • Shims, which allow you to divert calls outside your application to your own implementation. An example might be to shim system calls to get the date in order to test for specific scenarios (e.g. a leap year). Read more here.

Code Maps

Code Maps allows you to understand and explore your source code visually.

Why is that useful? It allows you to build your understanding of the relationships between different aspects of your code without needing to read the source. Code Maps allows you to drill from assemblies down to namespaces, classes, methods and down to the underlying source, as well as filter out those elements you’re not interested in, such as test code or system assemblies.

You can add your own annotations and groupings, rearrange the diagrams and share them with others, via email, saved as diagrams or directly within Visual Studio. A Professional user can view CodeMaps but not author them.

I see this as useful for people new to a project, or when you’re going to be making changes to code that you aren’t familiar with or perhaps can’t remember.

One often overlooked capability is that you can also debug using Code Maps and this can really help in not having to keep track in your head of where you are in the code base whilst debugging.

Code Maps work for .NET and C/C++. For a demo here’s a 9 min overview video and some more documentation.

Live Architectural Dependencies (revised for 2017)

Architecture layer diagrams have been in Visual Studio for quite a while, but they've now been reengineered to take advantage of Roslyn, making them much more effective. You now get live, as-you-type warnings and errors, allowing you to enforce architectural rules across your codebase.

Here's how I think of the benefits: it's quite possible to have code that compiles, passes unit tests, passes functional tests and is in use in production, but that still exhibits poor architecture, such as incorrect dependencies between components/tiers/services. This can result in issues in the future. Live Architectural Dependencies allow you to stop this happening at the earliest possible time: as the problematic code is being typed.

From a licensing perspective this requires an Enterprise license to author, but any edition to pick up and enforce the rules.

10 min Live Architectural Dependency video.

IntelliTrace

IntelliTrace is a historical or post-mortem debugging technology and essentially helps to address the “I can’t reproduce it” problems, typically across dev and test environments. It was introduced in 2010, so it’s not new, and can be used across dev, test and production environments.

A simple example of its use is:

A tester finds a bug in the test environment. The tester repros the bug with IntelliTrace collection enabled, and the bug raised then includes the IntelliTrace log file. That log file includes the calls and events from the test environment.

The developer opens the bug and the IntelliTrace log file in Visual Studio Enterprise, and can view the exceptions raised and, for each of those exceptions, see the stack trace. The developer can choose to debug any of those exceptions, in which case Visual Studio will go into debug mode with execution halted at the line of code that threw the chosen exception. The developer can then step both forward and backwards through the source code to understand what happened.

Key advantage – the developer doesn’t need to work out how to reproduce the conditions to replicate the bug in order to then start understanding how to fix it. IntelliTrace allows the developer to debug as if attached for debug, even though this takes place later and on a different environment. For a problem where it’s environment based (e.g. can repro in test but not in dev) this can save a lot of time.

This requires .NET 2.0 or higher with C# or VB.NET, and works in ASP.NET, Microsoft Azure, Windows Forms, WCF, WPF, Windows Workflow, SharePoint and 64-bit apps.

You don’t need Enterprise to collect IntelliTrace data, but you do need Enterprise to use it.

There’s a lot more to it, here’s the MSDN documentation and a 12 min overview video.

Build and Release (new for 2017)

Whilst the continuous integration and continuous deployment capabilities of Team Foundation Server and Visual Studio Team Services have been around since before 2017, there are two related areas that Visual Studio 2017 Enterprise gives you access to without additional licensing:

Packages. Package management provides a private NuGet, NPM or Maven repository for publishing and sharing code. Read about package management. This has been added as an extension and requires additional licenses unless you have Enterprise, in which case it’s included.

Build and release pipelines. The way that build and release is licensed has changed to a model that licenses concurrency. You get one pipeline free, and if you want more than one build and/or release to be running at the same time then you need to purchase pipelines. Here’s a good explanation about pipelines but the key point here is that each Visual Studio 2017 Enterprise license gives you one extra pipeline for both TFS and VSTS.

Testing Tools

To summarise simply, you get all the testing tools in Enterprise. In other words:

  • Test case management
  • Manual testing
  • Exploratory testing
  • Automated functional testing
  • Load and performance testing

Enterprise provides test case management, manual and exploratory testing by including Microsoft Test Professional as part of the Enterprise edition. I won’t cover that now but you can find out more here.

What’s unique to Enterprise are automated functional testing and load and performance testing.

The Visual Studio automated functional testing tool is Coded UI. As the name suggests this creates automated tests (i.e. including automated verification) and records them as code – either C# or VB.NET. You can record them as you perform actions, or create tests by reusing test methods at a code level. You can also promote manual tests to create the automated test and then add verifications.

Coded UI allows you to build regression suites that drive the UI of the application under test (web and thick clients), and to run those regression tests remotely, such as on a test environment and even as part of the release management capability discussed above. Note that executing a CodedUI test remotely doesn’t require an Enterprise edition, so other users can run them.

For more info on CodedUI here are the docs.

Load and performance testing has been around for a long time in Visual Studio and has evolved over the years. The core capabilities are the same: create a scenario that tests performance using a certain number of virtual users running a set of tests over a given time. You can factor in network conditions (e.g. if 10% of the users are on a very poor network connection, what's their experience?) and collect system performance counter information (CPU, memory, disk I/O and much more):

Here’s a walkthrough of creating and running a load test.

The latest changes have included the ability to choose to use Azure to generate the load i.e. you don’t need to find the hardware and set up the test rig. That’s without making any changes to the load test, just a radio button choice between the cloud and on-premise. Using the cloud means you can also choose where the load is coming from using the Azure data centers:

More details on cloud load testing and a 10 min video.

Xamarin (new in 2017)

Visual Studio 2017 (all editions) include the Xamarin development tools for creating native cross-platform mobile apps with C#. Visual Studio 2017 Enterprise adds some extra capabilities around Xamarin:

  • Embedded Assemblies: protects embedded native code.
  • Live Inspection (Preview). Allows you to programmatically interact with your live running app in real time on devices and emulators. This enables rapid design iteration and experimentation without needing to compile, build and deploy each time that changes are made.
  • Xamarin Profiler (Preview). Profiler is a full-featured profiling tool made specifically for Xamarin apps, helping developers identify memory and performance problems.
  • Xamarin Test Recorder. Record automated tests for your app.
  • 25% discount off Xamarin Test Cloud: this allows you to deploy your app and run automated tests against a range of real devices (over 2500 phones and tablets currently).

Visual Studio Subscriptions (MSDN) (revised for 2017)

Last but not least are the extra benefits (over Professional) that you get from Enterprise in Visual Studio Subscriptions (what used to be called MSDN). I’d highlight:

  • More Azure credit (currently £115 per month with Enterprise) so you can use more Azure services for free each month, such as running virtual machines.
  • An unrestricted set (i.e. a full subscription) of Pluralsight courses for a year (rather than the curated list previously provided).
  • Redgate ReadyRoll. This is an extension that provides continuous deployment support for your databases in VSTS.
  • PowerBI Pro.
  • Office 365 developer subscription for 25 users.
  • Office Professional production license.
  • Dev and Test downloads for SharePoint, Exchange, Dynamics, Office production use and PowerBI.
  • 2 collections of 10 courses of Microsoft e-learning.
  • 4 support incidents.

More details in the subscription editions overview.

Hopefully this gives you a flavour of the differences and if you’re in the position of either deciding which edition to get, or having become entitled to Enterprise from an upgrade, then you’ll have a better idea of the key additional capabilities.

Grow your business with Office 365 Advanced Security and Compliance


by Natee Pretikul, Senior Partner Marketing Manager, Office 365 Security and Compliance

Cybersecurity is a complex problem, from the volume of attacks to increasingly sophisticated hackers. Organizations don’t want to be in the headlines, as the impact to business is significant. Over the last year, many partners have started to develop and offer security-related services with Office 365 Advanced Security and Compliance. They’re addressing customers’ cybersecurity pain points, and helping enterprise clients of all sizes become more secure.

Our quantitative research shows that partners with a security practice expect to see their security business double by 2019 (MDC, Office 365 Solution Selling, April 2017). That’s a huge increase in business. You could see a similar impact to your bottom line with Office 365 Advanced Security and Compliance workloads.

With increasing security needs from customers, many partners that haven’t offered security services may wonder how to get started. Here are my suggestions.

Increase your knowledge with the latest resources

Lack of security talent in the market is a well-known issue. When you offer security-related services, your team should have great capabilities to support clients and understand product usage scenarios.

Microsoft is here to help. Check out the latest Office 365 Advanced Security and Compliance training videos at Partner University (access requires signing in as a Microsoft partner), with topics such as:

  • Fundamentals of building an Office 365 security practice
  • Perform security assessments with Office 365 Secure Score
  • How partners can help secure a customer’s Office 365 Tenant

At the end of the training, take the Technical Skills Assessment for Office 365 Security Partner Training to test what you’ve learned.

Learn from other partners’ success

The easiest way to get started is to learn from what other partners are doing. Many partners use a security assessment to raise security awareness, following up with a proof-of-concept (especially with Advanced Threat Protection), consulting and recommendations on next steps, driving deployment, and providing on-going services.

Staying secure is an ever-changing journey, which translates into ongoing customer demand. Managed services and project-based services make up more than half of cloud IT security revenue (MDC, Office 365 Solution Selling, April 2017). See examples of this trend in these case studies:

  • Learn how Wortell uses Secure Score to drive deeper security conversations.
  • Discover how Olive & Goose has built a successful business targeting the Office 365 security and compliance space.
  • Find out how Peters & Associates uses Microsoft security and compliance solutions as a key component in the transformation of its IT consulting practice.
  • Learn how DynTek uses Advanced eDiscovery and other Office 365 capabilities to drive business.
  • Explore how Planet Technologies helps government customers adopt Advanced Threat Protection for price/performance advantage.

Drive security assessment workshops with customers

Most customers are reactive when it comes to security, and you can help them take a more proactive approach, beginning with a security assessment. As you can see in the examples above, other partners have found that conducting workshops is an excellent starting point. To help you get started quickly, we developed the Office 365 Secure Score Assessment IPKit for Microsoft partners.

The IPKit provides you with eight documents that help you effectively leverage Office 365 Secure Score to assess a customer’s Office 365 tenant, then provide recommendations to help address their security concerns.
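
If you want to pull a tenant’s current Secure Score programmatically before a workshop, the sketch below shows one way to do it in Python against the Microsoft Graph secure scores endpoint. Treat it as a minimal sketch: the endpoint and the SecurityEvents.Read.All permission come from the Graph security API documentation, availability in your tenant is an assumption to verify, and the token value is a placeholder.

import requests

# Placeholder: an app-only Graph access token acquired for a registered app
# that has been granted the SecurityEvents.Read.All application permission.
token = "<access-token>"

# Fetch the most recent Secure Score snapshot for the tenant.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/secureScores?$top=1",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

score = resp.json()["value"][0]
print(f"Current score: {score['currentScore']} / {score['maxScore']}")

A one-line summary like this makes a useful opening for an assessment workshop, with the IPKit documents providing the follow-up recommendations.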

Join the Office 365 Security Partner Yammer group and click Files to access and download the IPKit.

Stay informed

The New Intune and Conditional Access Admin Consoles are GA


There are a handful of topics that consistently come up whenever I meet with our customers and partners, and one of the most common has to do with how to balance productivity for end users with the need for security and control of company data. The tension between these two needs is the stage upon which an even bigger challenge constantly looms: every IT team on earth is being asked to do more with less at a time when technology keeps accelerating and the landscape of their own industry shifts beneath their feet.

The request I get in these meetings is very clear and consistent:  We need efficient solutions that make it easier to manage and control growing complexity; can you help us reduce the complexity we are dealing with?

This is where we bring in the good news:  Managing Intune and Conditional Access together with Azure AD just got a lot easier for our rapidly growing community of IT Professionals.  As of today, we have reached two important milestones for Microsoft Intune and for EMS Conditional Access capabilities:  Both new admin experiences are now Generally Available in the Azure portal!

Here’s how Intune’s redesign helps your organization

Intune’s move to the Azure portal is, in technical terms, a really big deal. Not only did the Intune console change, but all of the components of the EMS console experience have now come together. The process of migrating capabilities into the new portal was an incredible opportunity to reimagine the entire admin experience from the ground up, and what we are shipping today is an expression of our unique vision for mobility management, shaped by the needs of our more than 45,000 unique paying customers.

I love the progress we’ve made here: Intune on Azure is great for our existing customers because they can now manage all Intune MAM and MDM capabilities in one consolidated admin experience, and they can leverage all of Azure AD seamlessly within that same experience. Awesome.

There is actually a whole lot more going on “behind the scenes” of the new administrative experience.  Not only have the administrative experiences converged, but we also converged Intune and Azure Active Directory onto a common architecture and platform.  Converging the architectures dramatically simplifies the work we do to support it, the work you do to use it, and it enables some incredible end-to-end scenarios across Identity and Enterprise Mobility Management.

Here are the 3 things you need to know about Intune on Azure:

  1. It’s built to leverage Azure’s hyper scale
    The Azure platform provides huge increases in elasticity and reliability for Intune and lays the foundation for nearly unlimited scale. The new admin experience also runs in any browser on any device form factor. Now you can manage Intune from anywhere – even from your phone!
    We currently have customers that are rapidly growing to hundreds of thousands of devices in a single tenant. No problem! One customer has shared that they applied a sophisticated policy to ~200,000 users, and what took hours in the past was done in less than 3 minutes. And because this is built into the Azure console, you get all the rich role-based administration for delegation of authority.
  2. It’s optimized for cross-EMS workflows
    With Intune’s move to Azure and the Azure Portal, we now share a console experience with other core EMS services like Azure Active Directory and Azure Information Protection. Having the collective power of these services living side-by-side makes them more effective and easier to manage across identity and access management, MDM and MAM, and information protection workloads.
    For example:  If you’ve just finished creating a set of conditional access policies to control access to data using Intune in the same portal environment, you’re now just a click away from adding additional app protection policies that ensure that your data is protected after it’s been accessed and is in use on mobile devices.
    The Intune transition to Azure also delivers deep integration with Azure Active Directory groups, which can represent both users and devices as native, dynamically targeted groups that are fully federated with an organization’s on-premises Active Directory.
  3. You can simplify, automate, and integrate management with Microsoft Graph
    Built on the Microsoft Graph API, the new Intune experience also opens the door for broader systems integration and automation. This means that our customers can now simplify, automate, and integrate workflows across Intune and the other services they are using, however they see fit. For more information about what you can do with this, I really recommend this post. Microsoft Graph API capabilities are currently in preview; expect a GA announcement for this functionality in the coming quarter. A minimal sketch of calling the API follows this list.
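
To make the Graph integration concrete, here is a minimal sketch, in Python, of acquiring an app-only token and listing Intune-managed devices. The token endpoint, Graph route, and DeviceManagementManagedDevices.Read.All permission reflect the Microsoft Graph documentation, but since the API was still in preview at the time of writing, treat the exact routes as an assumption; the tenant, client ID, and secret below are hypothetical placeholders.

import requests

# Hypothetical placeholders -- replace with your own tenant and app registration.
TENANT_ID = "contoso.onmicrosoft.com"
CLIENT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_SECRET = "<app-secret>"

# Acquire an app-only token via the OAuth2 client-credentials flow.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://graph.microsoft.com/.default",
        "grant_type": "client_credentials",
    },
)
token = token_resp.json()["access_token"]

# List Intune-managed devices
# (requires the DeviceManagementManagedDevices.Read.All permission).
devices = requests.get(
    "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices",
    headers={"Authorization": f"Bearer {token}"},
).json()

for device in devices.get("value", []):
    print(device["deviceName"], device["complianceState"])

The same token pattern works for the other Graph-backed EMS workloads mentioned above.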

If you haven’t tried Intune on Azure, we invite you to jump into this new experience with us.  To check it out for yourself, log into the Microsoft Azure portal right now.  We’re always listening and learning from your feedback, and we want to hear what you think!  Since we put this into preview in December there have been more than 100k paying and trial tenants provisioned!

Conditional Access – the new admin experience in the Azure portal

The new conditional access admin experience is also Generally Available today.  Conditional access in Azure brings rich capabilities across Azure Active Directory and Intune together in one unified console. We built this functionality after getting requests for more integration across workloads and fewer consoles.  The experience we’re delivering today does exactly that.

Organizations everywhere face the challenge of enabling users on an ever-expanding array of mobile devices, while the data they are tasked with protecting is moving outside of their network perimeter to cloud services – and all of this happens while the severity and sophistication of attacks are dramatically accelerating.  IT teams need a way to quantify the risks around the identity, device, and app being used to access corporate data while also taking into consideration the physical location – and then grant or block access to corporate apps/data based upon a holistic view of risk across these four vectors.  This is how you win.

Conditional access allows you to do this and ensure that only appropriately authenticated and validated users, from compliant devices, from approved apps, and under the right conditions, have access to your company’s data. The functionality at work here is technologically incredible, but it’s not always obvious how granular and powerful these controls really are. The new conditional access experience on Azure now makes the power of this technology crystal clear by showcasing the deep controls you have at every level in one consolidated view:

Now you can easily step through a consolidated flow that allows you to set granular policies that define access at the user, device, app, and location levels. Over the last 6 months, as I have shown this integrated experience to hundreds of customers, the most common comment has been: “Now I completely see what Microsoft has been talking about: how identity management and protection need to work with Enterprise Mobility Management to protect our data.” Microsoft’s Intelligent Security Graph is also integrated here, delivering a dynamic, risk-based assessment into the conditional access decision.

You can also control access to resources based on a user’s sign-in risk, drawing on the vast data in the Microsoft Intelligent Security Graph. Once your policies are set, users operating under the right conditions are granted real-time access to apps and data; as conditions change, intelligent controls kick in to make sure that your data stays secure. These controls include the following (a hedged automation sketch follows the list):

  • Challenging a user with MFA to prove that they are who they say they are.
  • Prompting the user to enroll their device in Intune.
  • Guiding the user to make adjustments to their device to meet your org’s security requirements.
  • Blocking access altogether or even wiping a device.
  • Granting different access privileges when using a native app (Word) vs. a web app (Word Online).
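
As an illustration of how such a policy can be expressed in automation, here is a minimal sketch that creates a conditional access policy through the Microsoft Graph conditional access endpoint, requiring MFA whenever sign-in risk is medium or high. This endpoint shipped after the console described here, so treat the route and payload shape as assumptions drawn from later Graph documentation; the application ID below is a hypothetical placeholder.

import requests

# Assumes a Graph token with the Policy.ReadWrite.ConditionalAccess permission
# (token acquisition follows the same pattern as the Intune sketch above).
token = "<access-token>"

# Require MFA for one (hypothetical) cloud app when sign-in risk is elevated.
policy = {
    "displayName": "Require MFA on risky sign-ins",
    "state": "enabledForReportingButNotEnforced",  # report-only while testing
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {"includeApplications": ["00000000-aaaa-bbbb-cccc-000000000000"]},
        "signInRiskLevels": ["medium", "high"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {token}"},
    json=policy,
)
resp.raise_for_status()
print("Created policy:", resp.json()["id"])

Starting the policy in a report-only state is a deliberate choice: it lets you observe who would have been challenged or blocked before you enforce the control.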

We believe Microsoft is uniquely positioned to deliver solutions that are this comprehensive and sophisticated yet remain simple to operate.  With EMS, these types of functionalities are possible because we’re building them together, from the ground up, to deliver on our commitment for secure and mobile productivity.

You can access the new conditional access console in the menu within both the Intune and Azure AD blades.  To see this functionality in action, check out this Endpoint Zone episode.

What’s Next

Our commitment to ongoing innovation means we never stop listening, shipping and reaching for what’s next.  Looking ahead, we’ll continue to release new features and enhancements at a steady pace throughout the year.  From this point forward, all new Intune and conditional access features will be delivered in the new portal, so keep an eye out.

Also:  Don’t hesitate to let us know what you think; our dialog with customers is our most valuable development input.

One last note:  This is a really significant day for all of us.  I am so pleased with the work that has been done here at Microsoft on the architecture and administrative experiences.  I’m happy for the team and what has been accomplished.  I am so pleased with the feedback that has come in from so many customers about the richness and vibrancy of the new admin experience as well as how performant the services are.  And, at the risk of sounding redundant, I’m happy to hear how much this has simplified your work while delivering incredible new, unique value such as the integrated Conditional Access.


Windows Server 2016: Run your applications on any platform


The thing about your digital transformation is that it’s a process, not an event. You can already see some benefits, but the ultimate payoff may be years down the road. In the meantime, you’ve got a wide variety of workloads running in different clouds, on different hosts, and on bare metal. That can create challenges when it comes to security, app development, VDI experiences, and disaster recovery.

Windows Server 2016 was designed specifically with these challenges in mind. In a new white paper, “Windows Server 2016: Run Workloads for Any Platform,” we’ve rounded up all of the guest-level benefits of running this OS in hybrid environments. Windows Server 2016 is designed to run both traditional and cloud-native workloads equally well on-premises or in the cloud, and can help resolve many of the issues that come with deploying workloads in hybrid and cloud environments:

  • Built-in security for virtual machines: Many security concerns are common across all platforms; they exist no matter where you run your workloads. Windows Server 2016 has protections against malware, ransomware, and credential theft built in at the OS level. You can significantly improve security for any app just by migrating it to Windows Server 2016.
  • Easier, more efficient app development: Whether you’re writing apps for virtual machines or for the cloud, Windows Server 2016 helps developers deliver apps to market faster and with greater impact by supporting innovations like containers, Server Core, and Nano Server.
  • Improved VDI experiences: With Windows Server 2016 as your workspace, you can provide personal-session desktops in the cloud, support modern graphics and 3D apps, and eliminate MultiPoint Server licensing costs.
  • More cost-effective disaster recovery: Storage Replica in Windows Server 2016 offers both synchronous and asynchronous replication of data between servers and clusters to keep all nodes in sync, regardless of whether they are located on-premises or in the cloud.

The white paper also provides an overview of capabilities that become available when you run your workloads on Windows Server 2016 on Azure. Download it today and learn more about the benefits of running Windows Server 2016, no matter where you run it.

Everything you need to know about Data Science from Microsoft’s virtual events


By: Federico Rodríguez, Editorial Coordinator, News Center Microsoft Latinoamérica

Getting the most out of Big Data: that may be the simplest (and most accurate) definition of Data Science. Given the enormous amount of data we generate and store, we need the skills to interpret it and extract valuable insights that can benefit the business.

For Microsoft, empowering people to achieve their full potential is central to its mission, and for that reason it created a series of open online events, run as seminars, to dig deeper into Data Science and the benefits and challenges that come with it.

Over five weeks, Microsoft experts hosted these virtual events, which covered the fundamentals of Data Science; how to leverage Cortana Intelligence to begin predicting future trends; how to build your first experiment in Azure Machine Learning; how to use Microsoft Cognitive Services APIs to build intelligent components into applications; how to integrate Data Lakes into business intelligence solutions; more about the Bot Framework and conversational computing; and the real-life experiences of a Data Scientist in Latin America.

These virtual events are available on demand, so you can watch them whenever and wherever you like. At this link, toward the bottom of the page, you will find all of the Data Science virtual events, along with webinars on other topics of interest.
