
SPFx DEV Environment offline installation


This post is about how to create a Development Environment for SharePoint Framework (SPFx) if your box does not have access to the Internet.

Exposition

Microsoft recently published Feature Pack 2 for SharePoint 2016. The biggest item in this update is that it makes the long-awaited SharePoint Framework (SPFx) available to on-premises customers; it has been available to SharePoint Online users since 23 February 2017. The installation of the framework is a relatively straightforward process: you just have to install Node.js, Visual Studio Code, the Yeoman generator, and the SharePoint Framework specific toolkit.

The only problem is that the currently available instructions (like this one from Vesa Juvonen) only show how to install all that if your box has access to the Internet. Of course, if you are working with SharePoint Online, you are not likely to work on a box without access to the net, but if you are a developer in a locked-down corporate environment, you might have a bit of an issue. Let's see how to get around that.

If you want to skip the drama, just jump to the Falling Action section of this post.

Climax

Act 1

To expand the action plan of the SPFx Toolkit installation on a clean machine, this is what you have to do:

  1. Install the latest version of Node.js.
  2. In a PowerShell window, install the Yeoman generator.
    • npm install -g yo
  3. Still in PowerShell, install Gulp (for example) for previewing and testing your solution.
    • npm install -g gulp
  4. Still in PowerShell, install the Microsoft SharePoint Framework toolkit.
    • npm install -g @microsoft/generator-sharepoint
  5. Finally, install Visual Studio Code (or whatever dev environment you want to use).

My first thought was that I would just use Fiddler to see what is being downloaded when I run the npm installer. Oh boy, was I wrong... While Fiddler is a great tool most of the time, certain modules do not use the INetProxy configuration, so they are not aware that something is trying to sniff the wire and simply go around it. I had to find something else.

Act 2

After a few hours of searching, I found a nice little project by Glen R. Goodwin (co-authored by Dan Bornstein and a few others) called NPMBox, which is an "npm addon utility for creating and installing from an archive file of an npm install, including dependencies. This lets you create a "box" of an installable package and move it to an offline system that will only install from that box."

The tool really does work. It is indeed capable of fetching all the packages and dependencies (almost, but let's not run that fast...) and putting them into a nice little file that you can copy onto your locked-down environment and install offline.

The installation of the Yeoman and Gulp packages went without a problem, but the installation of the SharePoint Framework toolkit failed with a naughty error message:

Installing @microsoft/generator-sharepoint...
An error occurred while installing @microsoft/generator-sharepoint.
@microsoft/generator-sharepoint was not installed.

Act 3

This is where I had to come back to the original question: how can I sniff what npm is doing on the network? Fortunately, Luke Brendt had already figured this out. Well... not exactly this, but how to use Node.js behind a proxy.

npm config set strict-ssl false
npm config set registry "http://registry.npmjs.org/"
npm --proxy http://username:password@cacheaddress.com.br:80 install packagename

Given that Fiddler is nothing else but a proxy, I could use the instructions in his blog entry to configure my machine to use Fiddler as a man in the middle. Awesome... The next installation attempt revealed that the NPMBox package was missing a single component (http://registry.npmjs.org/postcss-modules-extract-imports/-/postcss-modules-extract-imports-1.1.0.tgz).
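
For example, with Fiddler listening on its default port 8888 on the same machine, the proxied install of the toolkit looks like this (a sketch, assuming no proxy authentication is required):

npm --proxy http://127.0.0.1:8888 install -g @microsoft/generator-sharepoint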

How to put this into the package? Well... That's not an easy job, but why would I bother with that when Fiddler has a nice AutoResponder function? The only thing that needs to be done is saving this file somewhere and creating a nice rule pointing to it. Something like this:
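
A plausible equivalent of that rule (a sketch, assuming the .tgz was saved to C:\Temp) matches the exact URL and answers with the local file:

Match:  EXACT:http://registry.npmjs.org/postcss-modules-extract-imports/-/postcss-modules-extract-imports-1.1.0.tgz
Action: C:\Temp\postcss-modules-extract-imports-1.1.0.tgz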

 

Falling Action

To summarize, you have two options to install the bits required for a SharePoint Framework DEV environment:

 

Method 1: Capture the whole installation network traffic and play it back with Fiddler.

  • This is probably the easier way to do it, although it does not make it possible to automate the installation, should you need to create installation packages.
    1. Configure NPM to use HTTP traffic.
npm config set strict-ssl false
npm config set registry "http://registry.npmjs.org/"
    2. Start Fiddler and start capturing.
    3. Install your package using Fiddler as a proxy.
npm --proxy http://username:password@cacheaddress.com.br:80 install packagename
    4. Save all packets from Fiddler.
    5. Transfer the .SAZ file to your locked-down box.
    6. Start Fiddler on the locked-down box.
    7. Import the .SAZ file into the AutoResponder rules.
    8. Configure the locked-down box as you did the previous one (step 1 in this section).
    9. Run the installation using Fiddler as a proxy, just as you did on the previous one (step 3 in this section).

 

Method 2: Use NPMBox to package most of the stuff and use Fiddler to replay only the single file that is missing from the package.

  • Personally, I favor this option as it is cleaner and could be packaged in the future, once the guys fix that tiny little problem of the missing file.
    1. Install NPMBox.
npm install -g npmbox
    2. Create an npmbox package for Yeoman, Gulp and the SharePoint Framework toolkit.
npmbox --proxy='http://127.0.0.1:8888' -t c:\Temp\Yo yo
npmbox --proxy='http://127.0.0.1:8888' -t c:\Temp\Gulp gulp
npmbox --proxy='http://127.0.0.1:8888' -t c:\Temp\SPFx @microsoft/generator-sharepoint
    3. Configure NPM to use HTTP traffic.
npm config set strict-ssl false
npm config set registry "http://registry.npmjs.org/"
    4. Re-install the SharePoint Framework toolkit using the NPMBox package.
npmunbox --proxy='127.0.0.1:8888' -g SPFx
    5. Take note of the URL of the extra package that is loaded by the installer. It's going to be a version of the postcss-modules-extract-imports-1.1.0.tgz module. (Note: this step is required only until the NPMBox guys fix the issue with the missing packet.)
    6. Save the module as a file.
    7. Copy the NPMBox packages and the extra file to the locked-down box.
    8. Install Fiddler and configure an AutoResponder rule for the missing packet with the URL from step 5. (Note: this step is required only until the NPMBox guys fix the issue with the missing packet.)
    9. Use NPMUnbox to install the offline packages.
npmunbox --proxy='127.0.0.1:8888' -g Yo
npmunbox --proxy='127.0.0.1:8888' -g Gulp
npmunbox --proxy='127.0.0.1:8888' -g SPFx

 

Dénouement

Now that you have a way to deploy a SharePoint Framework (SPFx) development environment in any locked-down environment, you can test, try, use, and spread the word that this is really some good stuff for SharePoint, be it an Online or an on-prem installation.

 


Update 1709 for Configuration Manager Technical Preview Branch – Available Now!


Hello everyone! We are happy to let you know that update 1709 for the Technical Preview Branch of System Center Configuration Manager has been released. Technical Preview Branch releases give you an opportunity to try out new Configuration Manager features in a test environment before they are made generally available. This month's new preview features include:

  • Co-management - Co-management is a solution where Windows 10 devices with Fall Creators Update can be concurrently managed by Configuration Manager and Intune, as well as joined to Active Directory (AD) and Azure Active Directory (Azure AD) to provide a way for you to modernize Windows 10 management over time. You can read more about co-management here.

This release also includes the following improvement for customers using System Center Configuration Manager connected with Microsoft Intune to manage mobile devices:

  • Improved VPN Profile Experience in Configuration Manager Console - VPN profile settings are now filtered according to platform. When you create new VPN profiles, each supported platform will contain only the settings appropriate for the platform. Existing VPN profiles are not affected. You can read more about this change here.

Update 1709 for Technical Preview Branch is available in the Configuration Manager console. For new installations please use the 1703 baseline version of Configuration Manager Technical Preview Branch available on TechNet Evaluation Center.

We would love to hear your thoughts about the latest Technical Preview! To provide feedback or report any issues with the functionality included in this Technical Preview, please use Connect. If there's a new feature or enhancement you want us to consider for future updates, please use the Configuration Manager UserVoice site.

Thanks,

The System Center Configuration Manager team

Configuration Manager Resources:

Documentation for System Center Configuration Manager Technical Previews

Try the System Center Configuration Manager Technical Preview Branch

Documentation for System Center Configuration Manager

System Center Configuration Manager Forums

System Center Configuration Manager Support

Download the Configuration Manager Support Center

Improving experience for VPN profiles for ConfigMgr and Hybrid MDM


Starting in the System Center Configuration Manager 1709 Technical Preview, we're making it easier to determine which VPN profile settings are supported on each platform - like the changes we've made to compliance policies and configuration items. When creating a new VPN profile, you'll first choose the platform it applies to, and then all the settings in the following wizard pages will apply to the selected platform. This will make it much easier to avoid creating an invalid profile - which will in turn reduce the need to troubleshoot broken VPN profiles or to contact support.

We started down this path several releases ago when we split the Windows 10 VPN workflow from the all platforms workflow. Now, we've split up all the supported platforms so they'll each have their own path.

In addition to splitting out the workflows by platform, we've also combined the Configuration Manager client and hybrid mobile device management (MDM) workflows for Windows 10, since both management methods now support the same settings. For Windows 8.1, we've clearly marked the settings supported by Configuration Manager only, and we've retained the import option.

Finally, we've removed the Automatic VPN page, since all the settings configured by this page were deprecated by their respective platforms, making this page obsolete.

In this blog post, we'd like to answer some questions you may have.

Why did you make this change?

The main driver for this change is to prevent customers from inadvertently creating invalid VPN profiles. Prior to this change, all VPN settings for all platforms supported by Configuration Manager were exposed in the all platforms workflow. Some settings were labeled by platform (specifically, per-app VPN for iOS), but beyond this it was impossible to tell which settings applied to which platform; also, the Automatic VPN page was still there even after it had become obsolete.

Customers and support staff would then ask why a specific configuration wasn't working correctly. In most cases, they had created a profile with settings that were not supported by the platform. Sometimes the setting was supported for one of the targeted platforms, but not another, and it was impossible to tell from the user experience. Finding out that the configuration the customer wanted to use wasn't supported was disappointing and frustrating for everyone involved. These changes are designed to prevent these issues.

In earlier releases, we made similar changes in compliance policies and configuration items for the same reason. VPN is the first of the company resource access profiles to get this treatment, and while it was mainly designed to improve the experience for MDM profiles, the updates benefit devices managed by the Configuration Manager client as well - particularly because the Windows 8.1 settings are clearly set apart from all the mobile platforms now.

What about my existing profiles?

We understand that many of our customers use VPN profiles for multiple platforms, and by this point, you might be concerned. However, you don't need to worry about your existing profiles; one of our goals was to ensure that all existing profiles continue to work as they did before the change. When you upgrade, you will still see the same properties pages, and no changes will be made to the profiles themselves. All new profiles will use the new experience, but all existing profiles will still use the previous experience.

Let us know what you think!

If you're eager to have similar changes applied to other profile types, please leave a request on UserVoice.

If you still have questions, or are experiencing issues, reach out to your Microsoft contact or support team.

You can also find more information about this change here.

 

Thanks,

Tyler Castaldo

Program Manager, Enterprise Mobility

Invest in yourself. Invest in your technical readiness


Throughout my career as an IT professional, I have witnessed many different technology trends within the enterprise, and as a result I have always had to adapt to ensure I was able to effectively plan, deploy, and operate them. I attended professional training courses and hands-on labs, wrote certification exams, participated in user groups, and used a myriad of other techniques to keep my skills evergreen and enable me to perform effectively. One thing is clear: working in IT, there will always be change in the type of technology being used, and to remain competitive, having skills in that technology is paramount. This is more true than ever with the advent of cloud technologies and the rapid pace at which those technologies are being updated.

Recently, a new tool has come into my life that makes this readiness journey easier and more accessible than ever. Rather than attempting to source online training courses, hands-on labs, videos, on-demand sessions from conferences, and so on separately, there is now a tool that unifies those sources in a single place. That tool is Microsoft Tech Academy.


With Microsoft Tech Academy, I have access to a centralized repository of readiness content and can track my progress as I complete it. This is valuable not only for ensuring I complete each piece of content, but also for reporting my own readiness status back to my management (and adding it to LinkedIn). At the time of this writing, there are 23 "pathways" I can choose from, ranging from Office 365 ProPlus: Deployment and Management to OneDrive for Business to Microsoft Graph to Office 365 Administration and more. Inside a pathway you will find a description and a learning outcome that set expectations of what you will learn and, as a result, what you will be able to do with that information.

Note: This content is automatically sourced from 16 sources such as Microsoft Mechanics, Microsoft Technical Documentation, Microsoft Virtual Academy, EdX, etc.

Example of a pathway:


On the left side of the screen, clicking on My Workspace displays a collection of pathways I have bookmarked, along with a "study schedule" that allows me to define deadlines and the appropriate courses in a to-do window. A very nice way to stay organized when going through the training content.

Example workspace:


On the left side, clicking on Dashboard displays my progress in three sections: Pathways, Content, and Schedules. Pathways indicates how many pathways have been completed, are in progress, and are not started. Content indicates individual content items that have been completed versus bookmarked, and Schedules indicates how many items I have added to my study schedule and whether those have been completed or not started. As items are completed, the dashboard changes dynamically.

Example of dashboard:


Back to pathways: within the pathway I selected, I can view all the content in a single place. As I complete the content, whether it's a session from Inspire or a TechNet hands-on lab, clicking Mark as Complete under the item will mark it as completed. At the top of the page, a filter can be applied to see which items are completed versus outstanding (to do).

Example content:


At the top of the pathway, I can also see my overall progress of that pathway:


To back up the technical content, Microsoft Tech Academy is part of Microsoft Technical Communities – over 51 technical communities that span various Microsoft products and services. These communities give you the opportunity to ask questions and interact with your peers across the industry, as well as with Microsoft employees. In addition, product announcements and news about new features and services also appear in these community forums. Here is an excellent resource that will help you get started in the communities.


Lastly, as part of the Microsoft Technical Communities, I can browse blogs from each product team and subscribe to my favorites so I will be notified of future announcements. An excellent way to stay up to date!


Wrapping up: Microsoft Tech Academy and the Microsoft Technical Communities are a fantastic addition to your toolbelt to help you stay on top of your technical readiness. Enjoy!

Mixed Reality Partner Program Launches in Japan & Three Certified Partners Announced!! [Updated 10/7]


The Mixed Reality Partner Program, announced at "Japan Partner Conference 2017 Tokyo", Microsoft's annual conference for partner companies, launches for partner companies in Japan in October 2017!

Many enterprise customers in Japan are currently interested in digital transformation using Mixed Reality, and proof-of-concept projects and deployment evaluations are under way. Delivering a Mixed Reality solution requires not only the existing skills for developing enterprise applications and infrastructure, but also more creative skills for designing and developing Mixed Reality solutions.

Through the Mixed Reality Partner Program, Microsoft Japan and Microsoft Corporation therefore provide partner companies with training and technical information, and help them build up their skills through actual proof-of-concept projects with enterprise customers, so that more partners can deliver solutions built on HoloLens and Windows Mixed Reality devices to enterprise customers on the basis of solid development skills and knowledge.

Ahead of the full rollout of the Mixed Reality Partner Program, Microsoft Japan has announced the pilot participants in the program, Hakuhodo Inc., wise Inc., and Nextscape Inc., as the first Microsoft Mixed Reality partners in Japan. Please also take a look at the video.

 

Details of the Mixed Reality Partner Program are available here

 

 

Friday with International Community Update – Progress in each language (Sept. 2017)


Hello, Wiki Ninjas!
Today is Friday with International Community Update.

The end of September is as follows:

The topic of this month:

  • No order change.
  • Polish and British posted many articles. They are likely to catch up with Vietnamese soon.

Thank you!!

Tomoaki Yoshizawa (yottun8)
Blog: blog.yottun8.com
Facebook: Tomoaki Yoshizawa
twitter: @yottun8
TechNet Profile: Tomoaki Yoshizawa

Image uploading with Azure Functions node.js and Angular 4


In this blog post, I'd like to explain how to upload an image to Azure Blob storage from an Angular 4 SPA application and an Azure Functions HttpTrigger in Node.js. In this experiment, I use a MacBook Pro with the Azure Functions CLI from the core branch, which means Azure Functions 2.0 with local debugging.

Binary Uploading Strategy

You can choose between two strategies for uploading an image: multipart/form-data or base64 encoding. In this use case, I recommend base64 encoding. If you choose the multipart/form-data strategy, you need to parse the multipart body; however, every multipart parser is written for express, not for Azure Functions (e.g. busboy). Even if you use azure-function-express, you can't do it at this time: the Azure Functions req object doesn't have some of the methods the multipart parsers rely on. If you want to go multipart/form-data with Azure Functions Node.js, you need to write a multipart parser yourself.

The other solution is to use C#. I will try to write a multipart parser for Azure Functions Node.js at some point; until then, I recommend the base64 encoding strategy. Keep it simple.

Architecture

I'm using Angular 4.x as the SPA. From this SPA, I'll send the image to an HttpTrigger function; then, using a blob output binding, the function uploads the image file to a container.

 

SPA  -> Azure Functions (HttpTrigger with Blob output bindings) -> Storage Account

 

Something like this.

 

Azure Functions Settings for SPA

If you access Azure Functions from the SPA, you might encounter this error.

Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:4200' is therefore not allowed access. The response had HTTP status code 404.

This is the CORS problem: JavaScript code running in the browser can't access resources on an outside domain, and allowing it requires server-side configuration. To avoid this issue in your local debugging environment, add a CORS setting to your local.settings.json.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "AzureWebJobsDashboard": ""
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*"
  }
}

Now you are ready to write code!

Image Upload using Angular 4

The code is very easy. 1. I base64 encode the file using the FileReader readAsDataURL() method as soon as a file is supplied to an input. 2. Angular then invokes the upload() method; when the "load" event fires, it creates a JSON object as the body and sends it to the server. The numbered comments in the code below mark these steps.

car-detail.component.ts

import { Component } from '@angular/core';
import { Http, Headers, RequestOptions } from '@angular/http';
import { Car } from './car'; // the Car model class; a minimal sketch is shown after this listing

@Component({
  selector: 'app-root',
  templateUrl: './car-detail.component.html'
})
export class CarDetailComponent {
  title = 'Car Reviews';
  image = 'assets/noimage.jpg';
  car: Car;

  constructor(private http: Http) {
    this.car = new Car();
    this.car.name = "";
    this.car.company = "";
    this.car.description = "";
    this.car.image_url = "assets/noimage.jpg";
    this.car.state = "pending";
  }

  executeUpload(base64encoded: string, filename: string) {
    let headers = new Headers();
    let options = new RequestOptions({ headers: headers });
    let data = { filename: filename, data: base64encoded };
    this.http.post(encodeURI('http://localhost:7071/api/FileUploadNode/' + filename), data, options)
      .subscribe(
        data => console.log(data),
        error => console.log(error)
      );
    this.car.image_url = encodeURI("https://something.blob.core.windows.net/outcontainer/" + filename);
    this.image = base64encoded; // 3.
    console.log("File encoded");
  }

  upload(list: any) {
    if (list.length <= 0) { return; }
    let f = list[0];
    let reader = new FileReader();
    let self = this;
    reader.addEventListener("load", function () { // 2.
      let base64encoded = reader.result;
      self.executeUpload(base64encoded, f.name);
    }, false);
    if (f) {
      reader.readAsDataURL(f); // 1.
    }
  }
}
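
The component also references a Car model class that is not shown in the post; a minimal sketch (the ./car module path above is an assumption), with the fields the constructor uses:

export class Car {
  name: string;
  company: string;
  description: string;
  image_url: string;
  state: string;
}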

 

After sending the image data to the server, we need to update the image on the screen. Although I have a binding to the Car instance, I don't map the uploaded image directly to car.image_url. Instead, I use the image property: in step 3, I pass the base64 image data to it. We could point it at the blob storage URL instead, but since the upload is an asynchronous operation, we would have to wait until the process has finished.

car-detail.component.html

 

<md-card-header>
  <div md-card-avatar class="example-header-image"></div>
  <md-card-title>{{car.name}}</md-card-title>
  <md-card-subtitle>{{car.company}}</md-card-subtitle>
</md-card-header>
<img md-card-image src="{{image}}" alt="Some car image">
<md-card-content>

 <md-card-content>

 

Decode base64 image with Azure Functions

Now you can receive an image encoded in base64 via the HttpTrigger of Azure Functions. Let's write the code to decode it. The base64-encoded image looks like the example below, per RFC 2397; just decode it.

Example of a base64 image; it includes a header:

[10/7/17 4:58:58 AM]   data: 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABagAAAG0CAYAAADXf8CiAAAMFWlDQ1BJQ0MgUHJvZmlsZQAASImVVwdYU8kWnltSCAktEAEpoTdBepUaqiAgHWyEJEAoAROCih1dVHDtIoIVXRVRdC2ALDZEsbAI2OsDFRVlXVzFhpo3KaDra9

Once you configure the output blob binding, all you need to do is:

context.bindings.outputBlob = response.data;

Then the image is uploaded to your Storage Account. In this case, the blob output binding's name is outputBlob.

index.js
// Split a data URI (RFC 2397) into its MIME type and decoded binary payload.
function decodeBase64Image(context, data) {
  var matches = data.match(/^data:([A-Za-z-+/]+);base64,(.+)$/);
  var response = {};
  if (!matches || matches.length !== 3) {
    context.log("Error case");
    return;
  }
  response.type = matches[1];
  response.data = Buffer.from(matches[2], 'base64');
  return response;
}

module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  context.log(req.body);
  let response = decodeBase64Image(context, req.body.data);
  context.log('filename: ' + req.params.filename); // route parameter from FileUploadNode/{filename}
  context.log('filetype: ' + response.type);
  context.bindings.outputBlob = response.data; // the blob output binding writes the decoded bytes
  context.res = {
    // status: 200, /* Defaults to 200 */
    body: "Uploaded "
  };
  context.done();
};

 

However, we have a problem: the uploaded image gets a random number as its name. I'd like to specify the filename.

 

Specify the name binding for images

We can configure binding data at runtime, but only in C#, so we need another strategy. We can't set the filename from our code; instead, we can use the Azure Functions binding feature. Note the route property below: when the function accepts a FileUploadNode/xxxxx URL, Azure Functions passes the xxxxx part in as {filename}, and the output binding can use {filename} as well.

Let's see the function.json 

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "route": "FileUploadNode/{filename}",
      "methods": [
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "path": "outcontainer/{filename}",
      "connection": "carreviewstr_STORAGE",
      "direction": "out"
    }
  ],
  "disabled": false
}

Now you can upload images from your Angular 4 app to Azure Functions (Node.js). I'll upload the whole sample once I finish the coding.
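
If you want to exercise the function without the SPA, a quick smoke test from the command line could look like this (a sketch; the base64 payload is truncated here, and a deployed function with authLevel "function" would also need a ?code=<key> query parameter):

curl -X POST 'http://localhost:7071/api/FileUploadNode/test.png' \
  -H 'Content-Type: application/json' \
  -d '{"filename":"test.png","data":"data:image/png;base64,iVBORw0KGgo..."}'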

Enjoy coding with Azure Functions!

 

 

 

Resource

Collect Linux data using custom JSON data source


In order to collect custom JSON data with OMS Log Analytics, you need at least OMS Agent for Linux v1.1.0-217. In this post, we will install the latest version.

For this blog I am using Ubuntu 16.04 LTS running in Microsoft Azure, but you can use any of the supported Linux operating systems. You can check the list at Supported Linux Operating Systems.

Downloading and Installing OMS Agent for Linux

Open your workspace and go to Settings > Connected sources > Linux servers, and copy the "DOWNLOAD AND ONBOARD AGENT FOR LINUX" command.


Run the command in your Linux VM and make sure it completes successfully.


If you see Shell bundle exiting with code 0, the agent was installed successfully, and after a few minutes you should see the agent among the connected Linux computers.

You can also confirm the VM is sending heartbeat with the below query in your log analytics workspace:

Heartbeat | where OSType == 'Linux'

Prepare your Script

In this example, I've prepared a shell script that collects some server details, like hostname, IP address, memory, etc., and prints the output in JSON format. The OMS Agent will capture the output of the script.


# Get VM Hostname
hostname=`hostname` 2> /dev/null

# Get Linux Distribution
distro=`python -c 'import platform ; print platform.linux_distribution()[0] + " " +        platform.linux_distribution()[1]'` 2> /dev/null

# Get Server uptime
if [ -f "/proc/uptime" ]; then
uptime=`cat /proc/uptime`
uptime=${uptime%%.*}
seconds=$(( uptime%60 ))
minutes=$(( uptime/60%60 ))
hours=$(( uptime/60/60%24 ))
days=$(( uptime/60/60/24 ))
uptime="$days days, $hours hours, $minutes minutes, $seconds seconds"
else
uptime=""
fi

# Get VM private IP Address
IPAddress=`ip addr | grep 'state UP' -A2 | tail -n1 | awk '{print $2}' | cut -f1  -d'/'` 2> /dev/null

# Get VM Public IP Address
PublicIP=`wget http://ipecho.net/plain -O - -q ; echo` 2> /dev/null

# Get the number of CPUs
NumberOfCPUs=`grep -c ^processor /proc/cpuinfo` 2> /dev/null

# Get the Average CPU Load
CPULoad=`top -bn1 | grep load | awk '{printf "%.2f\n", $(NF-2)}'` 2> /dev/null

# Get the total Memory
TotalMemoryMB=`grep MemTotal /proc/meminfo | awk '{print $2}'` 2> /dev/null

# Get percentage of memory in use
MemoryInUse=`free | grep Mem | awk '{print $3/$2 * 100.0}'` 2> /dev/null

# Get percentage of free memory
FreeMemory=`free | grep Mem | awk '{print $4/$2 * 100.0}'` 2> /dev/null

printf '{"hostname":"%s","distro":"%s","uptime":"%s","IPAddress":"%s","PublicIP":"%s","NumberOfCPUs":"%s","CPULoad":"%s","TotalMemoryMB":"%s","MemoryInUse":"%s","FreeMemory":"%s"}n' "$hostname" "$distro" "$uptime" "$IPAddress" "$PublicIP" "$NumberOfCPUs" "$CPULoad" "$TotalMemoryMB" "$MemoryInUse" "$FreeMemory"

In this example, I've placed my script in /var/www/html, but feel free to put it anywhere else accessible by the omsagent user. Additionally, you can make omsagent the owner of the script (with the omiusers group) by running the following:

sudo chown omsagent:omiusers /var/www/html/serverdetails.sh

Make the script executable by running the below command:

chmod +x /var/www/html/serverdetails.sh
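
Before wiring the script into the agent, it's worth running it once by hand to confirm it emits valid JSON; piping it through Python's json.tool (Python is already a dependency of the script) is a quick check:

bash /var/www/html/serverdetails.sh | python -m json.tool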

Configure Input plugin

The custom data sources can be simple scripts returning JSON, such as curl, or one of the FluentD plugins. In our example, we will use the exec input plugin.

Create a new configuration file, serverdetails.conf, in the OMS Agent additional configuration directory:

nano /etc/opt/microsoft/omsagent/YourWorkspaceID/conf/omsagent.d/serverdetails.conf

In order to collect JSON data in Log Analytics, we need the following:

- Add oms.api. to the start of the FluentD tag parameter in the input plugin. This prefix is also used when searching for your data in Log Analytics; for example, the custom tag oms.api.serverdetails will have a record type of serverdetails_CL.

- In command, type the command to execute that returns JSON output; in this example, we call our script located at /var/www/html/serverdetails.sh.

- format should be json.

- run_interval is the interval at which to run the script.

Note: Make sure to change YourWorkspaceID path to reflect your actual workspace ID in Buffer_path parameter.



<source>
  type exec
  command 'bash /var/www/html/serverdetails.sh'
  format json
  tag oms.api.serverdetails
  run_interval 30s
</source>

<match oms.api.serverdetails>
  type out_oms_api
  log_level info

  buffer_chunk_limit 5m
  buffer_type file
  buffer_path /var/opt/microsoft/omsagent/YourWorkspaceID/state/out_oms_api_serverdetails*.buffer
  buffer_queue_limit 10
  flush_interval 20s
  retry_limit 10
  retry_wait 30s
</match>
Now that we've created our configuration file, we will need to change the ownership of this file to the omsagent user and omiusers group.

sudo chown omsagent:omiusers /etc/opt/microsoft/omsagent/YourWorkspaceID/conf/omsagent.d/serverdetails.conf

To confirm the ownership has changed type:

ls -l /etc/opt/microsoft/omsagent/YourWorkspaceID/conf/omsagent.d/


Restart the OMS Agent

Restart the oms agent by running the following:

sudo /opt/microsoft/omsagent/bin/service_control restart

Once you restart the agent, it should pick up the new configuration file in the omsagent.d directory. To confirm, you can check the omsagent log file located at

/var/opt/microsoft/omsagent/YourWorkspaceID/log/omsagent.log


If you have any errors, they will also be listed in the same log file, so go back and check your configuration. If all of your configuration is correct, you should start seeing your data in Log Analytics within a few minutes.

Viewing your data in Log Analytics

The data is collected in Log Analytics with a record type derived from the FluentD tag parameter. In our case it is

serverdetails_CL


Notes:
- Records may take a while to be converted into custom fields. Once they are created, you will be able to see them in the custom fields blade.
- All values returned by the JSON have a data type of System.String, so you cannot run a query like

serverdetails_CL | where CPULoad_s > "50"

as you will get "Cannot compare values of types string and string. Try adding explicit casts". But you can search for it as a string value, for example:

serverdetails_CL | where IPAddress contains "10.0.0"
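
If you do need a numeric comparison, casting the string field first should work; a sketch using the todouble() conversion:

serverdetails_CL | extend CPULoad_d = todouble(CPULoad_s) | where CPULoad_d > 50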

Top Contributors Awards! October’2017 Week 1


Welcome back for another analysis of contributions to TechNet Wiki over the last week.

First up, the weekly leader board snapshot...

 

As always, here are the results of another weekly crawl over the updated articles feed.

 

Ninja Award Most Revisions Award
Who has made the most individual revisions

 

#1 M.Vignesh with 120 revisions.

 

#2 Kapil.Kumawat with 102 revisions.

 

#3 RajeeshMenoth with 51 revisions.

 

Just behind the winners but also worth a mention are:

 

#4 Richard Mueller with 48 revisions.

 

#5 PriyaranjanKS with 46 revisions.

 

#6 Ken Cenerelli with 44 revisions.

 

#7 Burak Ugur with 29 revisions.

 

#8 Anthony Duguid with 29 revisions.

 

#9 Peter Geelen with 27 revisions.

 

#10 Ousama EL HOR with 19 revisions.

 

 

Ninja Award Most Articles Updated Award
Who has updated the most articles

 

#1 M.Vignesh with 58 articles.

 

#2 Richard Mueller with 44 articles.

 

#3 RajeeshMenoth with 21 articles.

 

Just behind the winners but also worth a mention are:

 

#4 Ken Cenerelli with 18 articles.

 

#5 Carsten Siemens with 15 articles.

 

#6 Kapil.Kumawat with 14 articles.

 

#7 Peter Geelen with 13 articles.

 

#8 Anthony Duguid with 8 articles.

 

#9 PriyaranjanKS with 6 articles.

 

#10 M.Qassas with 4 articles.

 

 

Ninja Award Most Updated Article Award
Largest amount of updated content in a single article

 

The article to have the most change this week was USMT 10: How to Migrate Windows User, by GinaMelf

This week's reviser was Peter Geelen

 

 

Ninja Award Longest Article Award
Biggest article updated this week

 

This week's largest document to get some attention is SharePoint 2016 GUID of Features, by Waqas Sarwar(MVP)

This week's revisers were M.Vignesh, Kapil.Kumawat & Maruthachalam

 

 

Ninja Award Most Revised Article Award
Article with the most revisions in a week

 

This week's most fiddled with article is Build the Attacker's Playground, by Jessica Payne (MSFT). It was revised 15 times last week.

This week's revisers were M.Vignesh, Kapil.Kumawat, RajeeshMenoth, Jessica Payne (MSFT), Sabah Shariq, John Rodriguez (MSFT) & Peter Geelen

 

Ninja Award Most Popular Article Award
Collaboration is the name of the game!

 

The article to be updated by the most people this week is SSAS : Parent/Child dimension properties, by Ousama EL HOR

This week's revisers were M.Vignesh, Kapil.Kumawat, NTRao, Ousama EL HOR, RajeeshMenoth & Burak Ugur

 

Ninja Award Ninja Edit Award
A ninja needs lightning fast reactions!

 

Below is a list of this week's fastest ninja edits. That's an edit to an article made shortly after another person's edit.

 

Ninja Award Winner Summary
Let's celebrate our winners!

 

Below are a few statistics on this week's award winners.

Most Revisions Award Winner
The reviser is the winner of this category.

M.Vignesh

M.Vignesh has won 25 previous Top Contributor Awards. Most recent five shown below:

M.Vignesh has not yet had any interviews, featured articles or TechNet Guru medals (see below)

M.Vignesh's profile page

Most Articles Award Winner
The reviser is the winner of this category.

M.Vignesh

M.Vignesh is mentioned above.

Most Updated Article Award Winner
The author is the winner, as it is their article that has had the changes.

GinaMelf

This is the first Top Contributors award for GinaMelf on TechNet Wiki! Congratulations GinaMelf!

GinaMelf has not yet had any interviews, featured articles or TechNet Guru medals (see below)

GinaMelf's profile page

Longest Article Award Winner
The author is the winner, as it is their article that is so long!

Waqas Sarwar(MVP)

Waqas Sarwar(MCSE 2013) has been interviewed on TechNet Wiki!

Waqas Sarwar(MCSE 2013) has won 33 previous Top Contributor Awards. Most recent five shown below:

Waqas Sarwar(MCSE 2013) has TechNet Guru medals, for the following articles:

Waqas Sarwar(MCSE 2013) has not yet had any featured articles (see below)

Waqas Sarwar(MCSE 2013)'s profile page

Most Revised Article Winner
The author is the winner, as it is their article that has been changed the most.

Jessica Payne (MSFT)

This is the first Top Contributors award for Jessica Payne (MSFT) on TechNet Wiki! Congratulations Jessica Payne (MSFT)!

Jessica Payne (MSFT) has not yet had any interviews, featured articles or TechNet Guru medals (see below)

Jessica Payne (MSFT)'s profile page

Most Popular Article Winner
The author is the winner, as it is their article that has had the most attention.

Ousama EL HOR

This is the first Top Contributors award for Ousama EL HOR on TechNet Wiki! Congratulations Ousama EL HOR!

Ousama EL HOR has not yet had any interviews, featured articles or TechNet Guru medals (see below)

Ousama EL HOR's profile page

Ninja Edit Award Winner
The author is the reviser, for it is their hand that is quickest!

M.Vignesh

M.Vignesh is mentioned above.

 

 

Another great week from all in our community! Thank you all for so much great literature for us to read this week!
Please keep reading and contributing!

 

Best regards,
— Ninja [Kamlesh Kumar]

 

Should You Have a Mentoring Program? [Updated 10/8]


(This article is a translation of Should You Have a Mentoring Program?, published on the Microsoft Partner Network blog on August 21, 2017. Please see the linked page for the latest information.)

 

When you look into how successful professionals reached their current positions, you often find that a mentor stood at a turning point in their career. Many successful people have learned the ropes of a new job from senior colleagues, or drawn fresh business ideas from industry thought leaders. A good mentor accurately understands the state of the industry, gives realistic career advice, and draws out motivation for the work. Building such relationships in an organized, business-appropriate way, however, is not easy.

Mentoring programs are an extremely effective way for individuals to grow their careers, but beyond that, it is becoming clear that building a mentoring program also pays off for the company. This article offers key points for delivering business value through mentoring, along with several strategies for building an organized, business-appropriate mentoring program.

 

The Role of a Mentoring Program

At the "Women in Technology Mentoring Circle Workshop" held at Microsoft Inspire (in English), panelists spoke about how much mentoring helped both in shaping their own careers and in the teams they built.

 

"The best advice my mentor gave me was: master your strengths, and the path to success will open."

- Aileen Hannah, Senior Business Strategy Analyst, Microsoft

 

Mentoring programs are not only beneficial to individual employees; they are also very effective in shaping company culture, improving leadership, developing successors, and passing on knowledge. They also pave the way for company growth, raise employee morale and motivation, and even reduce attrition and absenteeism. In areas such as improved productivity and leadership development, there are clear benefits for both mentees and mentors.

 

"The best advice my mentor gave me was: always hire people smarter than yourself."

- Kati Quigley, Senior Director, Partner Recruitment Programs, Microsoft

 

 

Making a Mentoring Program Succeed

Whether you want to build a formal mentoring program as a company institution or simply want to encourage mentoring among employees, building relationships between employees and giving them opportunities has great value for a company. Here are five key points for making a mentoring program succeed.

 

1. Show how the company benefits

More and more companies are finding value in building internal mentoring programs. In fact, 96% of Fortune 500 companies (in English) run some form of mentoring program as part of company-wide employee education and organizational development. Mentoring programs are highly effective, but unless you can clearly show how the company benefits, rolling one out organizationally will be difficult.

 

2. Set goals for mentees

Most professionals understand the value of a mentoring program, but mentees, as the ones receiving guidance, will find it easier to produce visible results if they set goals. Clearly define the outcomes you want from the program, including the purpose of the interactions between employees. Sharing expectations and pinning down exactly what you want to learn from your mentor makes the program far more meaningful.

 

3. Explain how mentors benefit

Most mentors take on the role simply out of goodwill and a desire to help others, but mentees are not the only ones who gain from a mentoring program. Mentors benefit greatly as well. Mentoring is a demanding role that requires both time and a sense of responsibility, but if you can clearly communicate what mentors get out of it, people will take on the role more willingly.

 

4. Define roles and secure active commitment

Building an organized mentoring program requires thorough commitment from everyone involved. A mentoring program is a kind of social contract, yet it is often undertaken casually, without clear rules. Building a program of real value to both those being invested in and the company requires solid, long-term commitment. Answers to questions such as what the mentor's role is, how many participants to aim for, and by what criteria to judge the program's success will differ from person to person. But communicating each person's role and the need for active participation goes a long way toward the program's success.

 

5. Market your mentoring program

Promoting your mentoring program inside and outside the company can make or break it. Think about who has the greatest need for such a program, and pitch it to the people who will best understand its value. When launching a mentoring program, we recommend first holding workshops or similar events where, as described above, mentors and mentees can talk, get to know each other, and set clear goals. Also be aware that unless you offer support to mentors, they may be reluctant to take on the role. A mentoring program should be developed into a cornerstone of company culture that benefits both employees and the organization as a whole.

 

If you have thoughts or experiences regarding mentoring programs, please share them in the Microsoft Partner Community (in English).

 

 

 

 

Azure Stack @ Ignite!


Ignite in Orlando has already come and gone, and now you can watch many great Azure Stack sessions right on YouTube! Later they will also be published on Channel 9. The advantage of the latter is that you can grab the session slides in addition to downloading the videos, whereas on YouTube you can only stream them.

Get ALL the Ignite 2017 sessions at aka.ms/Ignite/YouTube

Get the Azure Stack Ignite 2017 sessions on the Azure channel on YouTube at aka.ms/AzureStack/YouTube

Currently I also have the Ignite 2016 sessions at aka.ms/AzureStack/Ignite, but once the 2017 sessions are uploaded to Channel 9, I'll update this link. While many of the 2016 sessions are still relevant, I would always check the latest conference sessions first :). So for now, hit the YouTube channel, then stay tuned for the Ignite session uploads to Channel 9!

 

Site-Scoped Conditional Access Policies in SharePoint Online


In March 2017 we introduced device-based policies for SharePoint and OneDrive, enabling administrators to configure Tenant-level policies.

Device-based policies for SharePoint and OneDrive help administrators ensure that data on corporate resources is not leaked onto unmanaged devices, such as non-domain-joined or non-compliant devices, by limiting access to content to the browser and preventing files from being taken offline or synchronized with OneDrive on unmanaged devices.

On September 1st, 2017, we continued to evolve our conditional access investments to address the ever-changing security landscape and business needs by introducing new levels of granularity that allow administrators to scope device-based policies at the site collection level. In addition, this granular policy can be configured to allow users on unmanaged devices to edit Office Online documents in the browser.

In the demonstration above, the Tenant is configured with a permissive device access policy, allowing full access from unmanaged devices, including desktop apps, mobile apps, and browsers. The Marketing site inherits the policy configured at the Tenant; the Legal site, however, has a policy configured that is less permissive than the Tenant-level one. In addition, members of the Marketing site, while limited to browser-only access on unmanaged devices, can continue to edit content they have access to, providing a seamless collaborative experience.

Configuring Policies

Once available in First Release Tenants, site-scoped device-based access policies can be configured with the SharePoint Online Management Shell.

Before you get started using PowerShell to manage SharePoint Online, make sure that the SharePoint Online Management Shell is installed and you have connected to SharePoint Online.

NOTE

The Tenant-level device-based policy must be configured to Full Access prior to configuring site-scoped policies.

  1. Connect-SPOService -Url https://<URL to your SPO admin center>
  2. $t2 = Get-SPOSite -Identity https://<Url to your SharePoint online>/sites/<name of site collection>
  3. Set-SPOSite -Identity $t2.Url -ConditionalAccessPolicy AllowLimitedAccess
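
To check the result, or to revert a site to the Tenant default, something like the following should work (a sketch, assuming Get-SPOSite surfaces the ConditionalAccessPolicy property; AllowFullAccess is the permissive value):

Get-SPOSite -Identity $t2.Url | Select-Object Url, ConditionalAccessPolicy
Set-SPOSite -Identity $t2.Url -ConditionalAccessPolicy AllowFullAccess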

Modern Service Management in the Intelligent Cloud


Executive Summary

Microsoft’s compelling Intelligent Cloud platform, coupled with our modern approach and perspective on IT Service Management, provides the answers to an often-asked question; Why doesn’t Microsoft have a dominant IT Service Management (ITSM) tool in the marketplace?

First, our view on IT Service Management is that traditional ITSM processes and practices haven't kept pace or evolved over the years; they simply do not support and embrace the modern capabilities the cloud provides, nor enable the agility and the deployment at speed and scale required for digital transformation. More and more Microsoft customers are asking for our assistance to "Transform and Modernize IT".

Secondly, our Intelligent Cloud and Intelligent Edge, along with our AppSource marketplace partners, do in fact provide a compelling, modern approach for modern (and traditional) Service Management and IT Asset Management scenarios, all with the capabilities of the Intelligent Cloud, as shared at IGNITE 2017 by our CEO Satya Nadella:

…to be multi device and multi sense.  You start at one device using speech, you may end in another device using ink.  All of the experiences and all of the infrastructure, applications, devices, are going to be infused with AI, natural language, speech, computer vision are all just going to be part of what we do and what we expect.”

The following details how this is accomplished.

The Intelligent Cloud & Intelligent Edge

The cloud is sometimes still viewed and managed through the lens of organizations who incorrectly think it is all disconnected and "siloed". Microsoft calls our strategy the "Intelligent Cloud" and "Intelligent Edge" for a reason: it assuredly is NOT based on siloed mailboxes, stove-piped applications, and individual virtual machines.

The design intent of the Microsoft Intelligent Cloud is to “empower organizations to unlock greater insights, transform teamwork, and securely enable innovative solutions”.  The paradigm of the Intelligent Cloud and Intelligent Edge is multi-device and multi-sense.  Everything highlighted in this article is or will eventually be closely unified.

For those in the IT organization, IT Service Management (ITSM) represents how IT manages, operates and transitions technology, designs services, and manages risk within the organization. And, more often than not, it is focused on operations and infrastructure rather than full-stack applications. Traditional ITSM hasn't kept pace with rapid changes in modern applications and services, modern application development approaches, the proliferation of cloud services, and automation. This isn't to say it won't work. But simply put, you won't achieve the outcomes and value as quickly from the innovation and capabilities the cloud provides when managing them with a legacy mindset and approach. Customers and colleagues in the ITSM industry often ask us;

  • “What impact does the Intelligent Cloud and Digital Transformation have on ITSM?”
  • “How does the Intelligent Cloud work with or enable IT Service Management?”
  • “What is meant by Modern Service Management?”

From purely an IT Service Management perspective, Microsoft has been continuously modernizing and evolving how technology is and should be managed.  This has driven updated practices and approaches in support of not only our Intelligent Cloud strategy, but in support of Digital Transformation. Transformation that businesses are keen to quickly achieve.  Lessons learned have then evolved our approach and guidance from traditional ITSM industry guidance into modern ITSM, also known as; Modern Service Management.

A senior Architect within our practice has suggested that Digital Transformation is the culmination of the following seven facts (augmented by this author);

  1. Business must innovate to compete (in their respective industries)
  2. Devices are inexpensive (think IoT)
  3. Compute is powerful and inexpensive (and easily automated)
  4. Storage is inexpensive (and easily automated)
  5. Internet (and network) connectivity is prevalent and inexpensive (and easily automated)
  6. Cloud enables transformational stuff to be at everyone’s fingertips e.g. garage developers now have access to Machine Learning and Artificial Intelligence thru the cloud
  7. Agile/DevOps makes it all happen faster (and easily automated)

Does your IT organization embrace and broker these facts? Or stand squarely in the way of them? This is a big change for IT organizations that often struggle to embrace this modern approach. Manually oriented, control-centric, and "technology agnostic" practices give way to leaner, automated, self-directed improvements that reduce friction, manual effort, failure demand, and IT intermediation, resulting in greater business value. We initially came up with an internal definition of Modern Service Management: a lens, intended to focus ITSM experts around the globe on the most important outcomes that evolve our customers from legacy, traditional IT models toward easier, more efficient, cost-effective and agile service structures. The following are just some of the traditional-to-modern positions and practices of Modern Service Management;

Modern Service Management like any change to the status quo is challenging.  "It ought to be remembered that there is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things… Because the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new. “​  - Niccolò Machiavelli

The above has been provided for context only. There have been plenty of articles, blogs, and videos from myself and colleagues, and covering this in detail is not the intention of this article. For more information on Modern Service Management, check out the reference links at the end of this article...

Enabling Traditional or Modern Service Management in the Intelligent Cloud

How then do we enable, evolve, and transition from traditional industry-based ITSM to Modern Service Management? It starts by rethinking the goals and objectives of "Service Management" as well as the platform it executes on. It also starts by understanding what changes when you consolidate and move applications and workloads from on-premises physical/virtual machines to a true private or public cloud.

To enable and support both traditional and/or modern IT Service and IT Asset Management capabilities, how does one utilize the Microsoft Intelligent Cloud? It's simple: start with the industry-leading, recognized modern business application platform, Dynamics 365.

At Ignite 2017, our CEO Satya Nadella, stated "modern organizations can't be captive to the old way of doing things" in terms of business applications. “Applications need to be more modular and automated than in the past”.

The Dynamics 365 Software as a Service (SaaS) Business Application platform, for instance, now offers modular apps that offer limited slices of focused functionality. Also discussed was the promise of "AI-first business applications”, which hinge on artificial intelligence for their core functionality.  Microsoft is using our own AI technology to handle online customer service and support inquiries, and other major tech companies are as well.

Dynamics 365 is “The next generation of intelligent business applications that help you empower your employees, engage customers, and optimize operations”. Complete solutions such as Customer Care and Field Service are available as part of the platform.  In addition, the platform is extended by customers and ISVs at a fraction of the cost of developing an application from the ground up.

Dynamics 365 provides the application platform functionality one expects in an industry recognized and industry leading platform;

  • Software as a Service (SaaS) and on-premises deployment options
  • Wizard-driven configuration that virtually eliminates the need for development or scripting for forms, workflows, business rules, and business process flows, while allowing for advanced development if needed using non-proprietary development skills (JavaScript, .NET)
  • Reporting services for specialized reports; built-in user and system charts and dashboards over current data; dynamic Excel spreadsheets and pivot tables; and Power BI for analytics
  • Microsoft AppSource, an application marketplace for utility functionality as well as complete business vertical applications
  • An industry-recognized and leading customer service solution that incorporates the Microsoft Bot Framework and artificial intelligence
  • Responsive portals implemented, modified, and extended without advanced web development skills, yet with the flexibility to extend through advanced means if needed
  • Modern mobile clients on all major mobile platforms, and a modern application for Windows 10, that allow access to data and functionality similar to what is available through the web experience
  • A Unified Service Desk client for integration with call distribution systems that can transcribe callers' voice, search knowledge, and measure sentiment as transcription takes place
  • Stage-based business process flows, workflows, and business rules that enforce adherence to process steps and activities

The above results in the following capabilities from just the Dynamics 365 platform to support and enable Modern Service Management;

  • On-premises implementations with consistent functionality that runs in the cloud (Dynamics 365 SaaS receives semi-annual updates). And hybrid implementation options that can locate data in the cloud or on-premises as needed or required.
  • Reduced backlog of configuration changes to IT and business outcomes within the Dynamics 365 platform
  • Easy creation of net-new functionality while allowing you to keep your changes separate from those of Independent Software Vendors (ISVs) and Microsoft.
  • Automated intake through email, chat, portal, social and other interfaces, and increased quality of intake through automation, with the ability to apply the power of artificial intelligence
  • Reduced time to value for the configuration and use of self-service portal, including multiple, diverse types of portal identity providers, reducing need to have multiple portals for interaction with external associates, customers, etc.
  • Responsive mobile portal access for associates/customers
  • Remote approvals through client or portal interface, and through Microsoft PowerApps.

The story doesn't end with Dynamics 365.  The Intelligent Cloud includes Office 365, the most popular cloud based messaging and collaboration platform.  Office 365 is seamlessly integrated to Dynamics 365 as one would expect in the Intelligent Cloud.

Dynamics 365 and Office 365 are very close family members of the Intelligent Cloud.  Minimally, they share the same portal for administration, operations, licensing and support and share Azure Active Directory for identity.  Many of the services provided by Office 365 and Office 365 ProPlus are natively integrated to Dynamics 365.

Office 365 in fact extends the Dynamics 365 common data service. For example, if you set up a meeting with a client or associate to discuss a request, incident, or issue, the case/ticket entity in Dynamics 365 can be automatically linked to the meeting entity in Office 365. Any entities created beyond what is in Dynamics 365 and Office 365 are available in the common data service. Office 365 working in tandem with Dynamics 365 provides;

  • Email, scheduling, task and contact management
  • Automatic SharePoint, Groups and Teams integration to Dynamics 365 which provide simple and easy collaboration and document management among users and non-users of Dynamics 365
  • Server-side and client-side integration with Exchange and Outlook, allowing users to work with Dynamics 365 applications within Outlook.
  • Microsoft Flow and PowerApps integrations to Dynamics 365 for end user “citizen” automation without the need of development skills

The combined capabilities of the above result in the following modern service management scenarios made possible with Office 365 and Dynamics 365;

  • Integrated group and team collaboration with associates and resources that are not licensed users of Dynamics 365. Examples include release/SCRUM teams and problem resolution teams that have shorter time frames.
  • Reduced cost of Dynamics attachment storage without sacrificing search capabilities. Examples include storing screenshots and documents related to assets or CIs.
  • The power of Dynamics 365 available while you're using Outlook on the desktop, web, or phone. Examples include support resources responding through email to support requests or incidents without having to go into a specific application.
  • Synchronization of not only email, but also appointments, tasks, and contacts, as well as support work actions and appointments with employees and associates.
  • Easy creation of after-action reports in Microsoft Word from an individual record.
  • Dynamic Excel spreadsheets for synchronized, in most cases up-to-date, reporting, data visualization, and analytics. They also provide an easy means of importing data from other data sources. Examples include asset data import and service and application analytics.
  • The ability for regular users to automate their own work scenarios using Flow between Dynamics, their Office 365 account, Project, Visual Studio, and many other services, without writing a single line of code.
  • Configurable workflows between your favorite apps and services. Examples include: when a release achieves a certain milestone, automatically create and link a project or other service item available in Microsoft Flow, get notifications, synchronize files, collect data, and more.

Office 365 and Dynamics 365 provide collaboration and business application capabilities that can then take advantage of services provided in Azure through multiple means of integration.

Microsoft Azure is "a comprehensive set of cloud services that developers and IT professionals use to build, deploy, and manage applications through a global network of datacenters. Integrated tools, DevOps, and a marketplace support you in efficiently building anything from simple mobile apps to internet-scale solutions."

And while Azure provides many services and capabilities that appeal to developers, there are many others that do not require application development skills.  By no means a complete list, some of the services arguably relevant to Modern Service and Asset Management include:

  • Azure Logic Apps helps you simplify and implement scalable integrations and workflows in the cloud between Microsoft and non-Microsoft cloud services, with no scripting or development skills required
  • Azure Operations Management Suite (also known as OMS) is a collection of management services including Log Analytics, Azure Automation, Backup, and Site Recovery
  • Azure Automation runbooks run in the Azure cloud and can access cloud resources, or external resources reachable from the cloud or on-premises
  • Azure Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms.  It can monitor the health of Dynamics 365 and other non-Microsoft applications

The combined capabilities above make the following scenarios possible with Azure, Office 365, and Dynamics 365:

  • Save time by designing complex processes across Office 365, Dynamics 365, Azure, and many other services using easy-to-understand design tools, and use templates to get started quickly.  Examples include integrating Incident Management with Azure Operations Management Suite Log Analytics to analyze failure patterns.
  • Seamlessly implement patterns and workflows that would otherwise be difficult to implement in code.  Example: a case/ticket in Dynamics 365 triggers and integrates a project in Microsoft Project and, in parallel, an enhancement project in Visual Studio Online
  • Customize your logic apps with your own custom APIs, code, and actions.  Connect and synchronize disparate systems across on-premises and the cloud
  • Build off BizTalk Server, API Management, Azure Functions, and Azure Service Bus with first-class integration support
  • Leverage the power of PowerShell to automate activities on-premises and in the cloud (see the sketch after this list).  Example: creating a new user in Active Directory and provisioning updated services to that user, based on role, across a number of services
  • Monitor compliance of Windows quality update servicing, or monitor the health of services
  • Monitor the health and performance of applications "externally" and "internally" across technologies and services.  Example: monitoring the health of Dynamics 365 instances requires a code snippet, provided by Microsoft, to be added to forms, which can then be leveraged by Azure Application Insights
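As a concrete illustration of the PowerShell bullet above, here is a minimal sketch; the account attributes are placeholders, it assumes the ActiveDirectory module is installed, and the commented cloud-side step assumes the MSOnline module and a hypothetical license SKU:

Import-Module ActiveDirectory

# Create the on-premises account (all values are placeholders)
New-ADUser -Name "Jane Doe" -SamAccountName "jdoe" -UserPrincipalName "jdoe@contoso.com" -Department "Support" -Enabled $true -AccountPassword (Read-Host -AsSecureString "Initial password")

# After directory synchronization, a cloud-side step could assign services by role,
# for example with the MSOnline module (the SKU name below is hypothetical):
# Set-MsolUserLicense -UserPrincipalName "jdoe@contoso.com" -AddLicenses "contoso:ENTERPRISEPACK"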

If you have stayed with the article this far, waiting for the actualization of either modern or traditional ITSM and ITAM data artifacts, workflows, and processes, you have arrived…

Leveraging all the aforementioned Intelligent Cloud capabilities of Dynamics 365, Office 365, and Azure is an AppSource-certified global ISV solution from Provance, Inc.  The Provance ITSM & ITAM solution simply extends Dynamics 365 with the entities (e.g., forms, rules, processes, metrics) and capabilities one expects to find in traditional, purpose-built ITSM solutions that are well known in the marketplace.

Now modern (or contemporary) IT Service and Asset Management processes co-exist with other business applications, utilizing the power of the common data service, Dynamics 365, Office 365, and Azure.  Some of the features of the Provance ITSM solution include:

  • The addition of eleven (11) PinkVERIFY™-certified processes: Incident, Problem, Change, Request, Release & Deployment, Knowledge, Service Continuity, Service Portfolio, Service Catalog, Service Level, and Event Management
  • IT Asset and Configuration Management included
  • Integrations with System Center 2016 for existing cloud or on-premises implementations:
    • Configuration Manager for asset/configuration item management and reconciliation
    • Operations Manager for monitoring and event generation
    • Orchestrator for automation
  • Native integration with Azure OMS for monitoring alerts
  • An automation framework that allows Azure Automation, PowerShell, Orchestrator, and third-party automation integration at the activity level
  • Out-of-the-box integration with Visual Studio Team Services or Team Foundation Server for cloud or on-premises implementations (can also be configured using Logic Apps/Flow)
  • A pre-configured self-service portal, modified through Dynamics 365 and portal administration
  • Pre-configured roles, views, forms, business rules, business process flows, and workflows
  • Templates for all ticket types
  • Configuration settings housed in data rather than in the solution, for easier administration, easier promotion to production, and data-oriented delegation of administration to non-administrators
  • Third-party notification for external suppliers (underpinning contracts)

The combined capabilities above make the following scenarios possible with the ITSM solution accelerator, Azure, Office 365, and Dynamics 365:

  • A virtually out-of-the-box ITSM and ITAM solution that is ready for an adapt-and-adopt approach to implementation, yet is easily extended within Dynamics 365 for specific requirements
  • Easy import of configuration and data from legacy ITSM solutions through simple Excel import (imports must follow business rules and relationships, so no bad data gets in)
  • The ability to intake tickets, incidents, and service requests through IM chat (Skype for Business, CafeX), recorded and analyzed voice (in multiple languages), the self-service portal, and email
  • Pre-configured, deep ITSM and ITAM analytics available from Dynamics 365, Power BI, and Provance ITSM
  • The ability to manually or automatically apply ticket templates that pre-populate incidents, requests, changes, and problems with values, relationships, and activities, including automation and approvals

The "all up" Intelligent, Modern Service Management Solution

What results from all of the above is a natively integrated, modern platform not only for legacy ITSM processes and procedures but, more importantly, a pathway to Modern Service Management patterns and practices that represent where IT organizations need to be in the future to remain relevant to their business.

There is a proliferation of ITSM solutions on the market.  We see more and more organizations seeking to consolidate and simplify technology, consolidate both internal and external service and support, automate intake and processing, increase levels of self-service and knowledge management, and utilize capabilities such as citizen automation, the release pipeline, artificial intelligence, machine learning, and bot technology to improve employee/associate productivity and value.

This article is intended simply to provide a point of view on how Microsoft's Intelligent Cloud and Intelligent Edge support not only your overall strategic digital transformation but also operational aspects such as IT Service and IT Asset Management.  It is important to note that these capabilities may not exist in all regions or government clouds.  Additionally, cloud-based services may or may not be available for on-premises implementations.

For more information, contact your Microsoft TAM, Account Executive, or Digital Advisor.  You can also contact Provance directly to better understand their IT Service Management solution on Dynamics 365.

Modern Service Management References

Below are just some of the videos, documents, and blog articles on Modern Service Management:

Modern Service Management

Modern Service Management for Azure Whitepaper

Channel 9 Video

The Release Pipeline

An example of Release Pipeline automation using PowerShell

The Release Pipeline Model video

Modern Service Management Customer Service and Support

Next-Generation Service and Support in a Mobile-First, Cloud-First World, Part 1

Next-Generation Service and Support in a Mobile-First, Cloud-First World, Part 2

AXELOS Blogs on Modern Service Management

Service Monitoring Service Outputs

Service Monitoring Service

Building Trust in the Service Monitoring Service

Making the Service Viral

Service Monitoring Application Development

Monitoring Service Health

The service monitoring service – rounding it all up

Modern Service Management for Office 365

Modern Service Management for Office 365

Monitoring and Major Incident Management

Monitoring: Audit and Bad-Guy-Detection

Leveraging the Office 365 Service Communications API

Evolving IT for Cloud Productivity Services

IT Agility to Realize Full Cloud Value - Evergreen Management

Service Desk and Normal Incident Management for Office 365

 

About the Author

John Clark is a "Modern" IT Service Management Solution Architect in the Microsoft Enterprise Services / Americas IT Service Management practice and Community Lead for the Modern Service Management Worldwide Community.  John's contributions to Microsoft have been recognized through his selection as a TechReady19 WW Communities Award winner, membership in the Microsoft Services Senior Technology Leadership Program (STLP), and, recently, achieving 2016 Gold Club.  John was formerly President of the Ohio Valley itSMF LIG, which won itSMF USA's annual "LIG of the Year" award during his leadership.  John has an extensive background in technology management and IT Service Management, as well as Dynamics 365, Office 365, hybrid cloud, Azure, and System Center.  He has published articles in various industry journals, co-authored a book on System Center, and has spoken at various industry trade shows on ITSM, Enterprise Architecture, Business Process Management and Design.

SharePoint: User profiles are imported with wrong domain name


In certain domain configurations, user profiles can be imported with the incorrect domain name.
For example: account names are supposed to be shown as CORP\User1, but profiles are imported as contoso\user1.

Note: This applies to both SharePoint Profile Synchronization (aka: FIM Sync) and Active Directory Import (aka: AD Import).

 

What's the impact?

There are a few problems this creates.

Since the profiles have account names at the User Profile Service Application (UPA) level that don't match the account names at the site level, the "<UPA Name> - User Profile to SharePoint Full Synchronization" timer job (aka: WSSSync) cannot synchronize profile data down to each site, meaning that users' job titles, etc., will not get updated at the site level.

When users browse to their My Site, or to any profile pages, a new "stub" profile gets created instead of using the existing profile that was imported.  For more about these "stub" profiles, see this.

 

How does this happen?

This occurs because the DNS name of the domain is different from the NetBIOS name of the domain.
In the example above:
DNS = contoso.com
NetBIOS = CORP

The profiles are being imported using the DNS name instead of the NetBIOS name, which is not correct.

If there is any doubt as to whether or not the names are different, you can look in Active Directory Users and Computers.  Right-click on the domain object and choose properties.  The DNS name will be listed at the top of the dialog.  The Netbios name will be listed in the “Domain name (pre-Windows 2000)” box.

Note: This only applies if the left-most portion of the DNS name is different from the Netbios name. For example: if the DNS name is “contoso.com” and the NetBIOS name is “contoso”, that is a match, and none of this applies.
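You can also check from PowerShell; a quick sketch, assuming the ActiveDirectory module is available on a domain-joined machine:

Import-Module ActiveDirectory
# Compare the left-most label of DNSRoot with NetBIOSName
Get-ADDomain | Select-Object DNSRoot, NetBIOSName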

How to fix it?

You must set the NetBiosDomainNamesEnabled property to “true” on the User Profile Service Application (UPA) object.
Once this is done, you must delete and recreate the Sync connection; that is the only way to get the above change to take effect.  It also helps to stop and restart the Sync service in between.

Note: If there are multiple Sync connections, only the connection that includes the domain that needs the NetBiosDomainNamesEnabled property should be recreated.

Warning: This is not a trivial matter. There are several things that need to be re-configured when recreating the Sync connection:
-- The AD Sync connections, including Domains and OUs selected.
-- Any import connection filters.
-- Profile property mappings.  These include custom mappings that have been made to out-of-box properties, and mappings made to custom properties.

You’ll need to document the settings for all of the above so that the Sync connection can be properly reconfigured.
So here are the steps:
-- Set the property:

# Get the User Profile Service Application object
$UPA = Get-SPServiceApplication | ? {$_.TypeName -eq "User Profile Service Application"}
# Enable NetBIOS domain names and save the change
$UPA.NetBIOSDomainNamesEnabled = $true
$UPA.Update()
-- Disable the My Site Cleanup Job (timer job) until you’ve run a few Syncs successfully.
-- Delete the existing Sync connection.
-- Stop and restart the User Profile Synchronization Service (from Central Administration). This isn't always required, but it helps to be thorough.
-- Re-create the Sync connection, including filters and property mappings.
-- Verify permissions for the Sync account (see additional tips #1).
-- Verify the connection was created correctly (see additional tips #2 -- only applies to FIM Sync).
-- Run a Full Sync.

Additional Tips:

#1  Now that NetBiosDomainNamesEnabled is set, the Sync account may need additional permissions in AD.  See:
http://technet.microsoft.com/en-us/library/8451dde9-bbd1-4285-bc24-71bd795fb912#permission
Particularly, this part is important:
“If the NetBIOS name of the domain differs from the fully qualified domain name, the synchronization account must have Replicate Directory Changes permission on the cn=configuration container. For example, if the NetBIOS domain name is contoso and the fully qualified domain name is contoso-corp.com, you must grant Replicate Directory Changes permission on the cn=configuration container”
So in summary, if it isn’t set already, the Sync account needs to have the “Replicate Directory Changes” permission on the Configuration container for each domain.
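If you prefer to grant the permission from a command line rather than through the AD GUI tools, here is a sketch using dsacls; the domain and account names are placeholders, and note that the right appears in the directory as "Replicating Directory Changes":

dsacls "CN=Configuration,DC=contoso-corp,DC=com" /G CONTOSO\spsync:CA;"Replicating Directory Changes"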

 

#2  Once you have re-created the Sync connection, you can check in the FIM client if the NetBIOSDomainNamesEnabled change was properly applied.

Open the FIM client (miisclient.exe) on the Synchronization server and click Management Agents.
Right-click the Active Directory MA (the one of type "Active Directory Domain Services") and choose Properties.
Click "Configure Directory Partitions".  You should see each domain selected, along with the "CN=Configuration" container.  Example: CN=Configuration,DC=Contoso,DC=com.
If the configuration container is not selected, the NetBIOSDomainNamesEnabled change was not properly applied.

Microsoft 365 Business Part 4 – Office 365 Business Deployment


In the earlier posts in this series we covered Windows 10 Business, Azure Active Directory, and Windows AutoPilot, and now we move over to the deployment of Office 365 Business. In the last post on Windows AutoPilot we saw that this gets installed automatically as part of the Azure Active Directory join and auto-enrollment into Intune, and today we'll take a look at enabling this deployment, as well as what is going on behind the scenes with Intune.

From the Microsoft 365 Portal we choose Manage Office deployment.

First of all, we need to select who to assign this to. We will keep this simple today by using the built-in groups, rather than doing more selective targeting.

In this case we have the All Users group, but if we had more groups created they would appear here.

Select All users

A couple of things to highlight here: first, this is targeting Intune-enrolled devices. How can we tell? Because we are doing an automated application installation, as opposed to just applying policies to an already installed application. Second, we only have the options to Install Office as soon as possible (which I have highlighted) or to Uninstall Office.

Take a second to review the changes.

And then we can close the window. That's it. If you've previously deployed the Office 365 desktop apps via the Office Deployment Tool or Intune, you probably realise there were a large number of options that you weren't presented with. So how do you know whether the selected defaults make sense for you? That's easy: we can just jump into the Azure Portal.

After opening the Intune blade, select Apps, and you can see something a little peculiar: the TYPE column for the Office Desktop Suite shows Office 365 ProPlus Suite (Windows 10). The version of Office that is installed converts to Office 365 Business when it is automatically activated, so don't worry about the ProPlus license not activating against a Business license; this is transparent.

Selecting Office Desktop Suite and then Properties shows us the three pre-configured property areas.

Configure App Suite shows that we could do a selective install of the Office suite components, but in this case we want a full install. The second thing to notice here is that we could also have Project and Visio installed, but you would need to have purchased and assigned those licenses separately. This is not a change you would usually make in the default Office installation settings; instead, you would create a new group and target that group.

Under App Suite Information we have some pre-populated options, but again the recommendation here is not to change these settings; instead, create a similar Mobile Apps policy with your required settings.

App Suite Settings is where things really get interesting, though; this is where Microsoft 365 Business makes decisions designed to be most beneficial across a variety of scenarios: hence the 32-bit installation, Monthly updates, acceptance of the EULA, and setup for single-user activation rather than shared computer/RDS installation. What other options would we normally see here? Let's take a look.
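As a point of reference, those defaults map fairly directly onto Office Deployment Tool (ODT) settings. Here is a minimal PowerShell sketch that writes an equivalent configuration.xml and runs the ODT; the product ID, language, and paths are assumptions, and this is not the actual file Intune generates:

# Write an ODT configuration mirroring the Microsoft 365 Business defaults
$config = @'
<Configuration>
  <Add OfficeClientEdition="32" Channel="Monthly">
    <Product ID="O365BusinessRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
  <Display Level="None" AcceptEULA="TRUE" />
  <Property Name="SharedComputerLicensing" Value="0" />
</Configuration>
'@
Set-Content -Path .\configuration.xml -Value $config

# Run the ODT setup (assumes setup.exe from the ODT is in the current folder)
.\setup.exe /configure .\configuration.xml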

Before I explain these options, I need to highlight that I am not editing the base configuration; I've just taken a screenshot of a new deployment. As the names of these releases have evolved, the names in the drop-down have also changed; if you take a look at my earlier post on this topic, you can see this. If you need to learn more, take a look at Overview of update channels for Office 365 ProPlus. If you want to get an idea of just how much the Office 365 desktop apps change over time, take a look at Office 365 client update channel releases.

The final section, Assignments, shows what we already know, that the All Users group has an install type of Required.


STEM Jobs Of Tomorrow


Both the education and technology sectors are full of buzzwords and acronyms, with STEM (Science, Technology, Engineering, Maths) being one of the biggest ones out there.

This morning I returned from a couple of weeks of vacation and saw a new website aimed at linking the jobs of the future with STEM:

http://www.makewhatsnext.com/careers/

The interesting part of this, for me, is that it’s powered by roles currently advertised on LinkedIn. You can choose two of your passions and it will then show you some possible jobs.

TURN YOUR PASSION INTO ACTION

You may not know it yet, but your interests today could be a science, technology, engineering or math (STEM) job of tomorrow. With insights from LinkedIn, explore jobs that can help you change the world.

[Image: selection of passions]

By clicking on two of the above areas of interest, possible jobs are shown based on current listings on LinkedIn.

I chose Sports and Technology when trying this out and was presented with the following:

[Image: possible jobs for Sports and Technology]

The stats are pretty sobering:

By 2018, it is projected that 2.4 million STEM jobs will go unfilled. Today, 42% of open STEM jobs on LinkedIn require computer science or coding skills.

No matter your passion, computer science skills can help you make a change in the world. Start exploring resources to gain these skills today.

From an education perspective, it is vital that today's students are gaining the skills they will need for tomorrow. If you work in education or have kids of your own, encourage them to check out the website above to get some ideas about future careers based on their current passions, as well as, possibly, some motivation to stay connected to STEM skills.

Extend Your Services with AI-Powered Solutions [Updated 10/9]


(This article is a translation of Extend Your Services with AI-Powered Solutions, posted on the Microsoft Partner Network blog on August 22, 2017. Please see the original post for the latest information.)

In recent years, breakthrough technologies in AI and cognitive services have energized the market and captured customers' attention. This has created a fresh opportunity to pursue new strategies for improving efficiency, gaining insights, and driving innovation.

 

As I noted in Enterprise Cloud Strategy, which I co-authored with Barry Briggs, a major factor behind the resurgence of AI is the enormous storage capacity and processing power the cloud has made available. The driving force behind what Christopher Bishop, Microsoft Technical Fellow and Laboratory Director of the Microsoft Research Cambridge Lab, calls "the most important transformation in the history of computing" is without doubt the ability to analyze data at cloud scale.

According to Bishop, this transformation was triggered by two key shifts:

  • A shift from software solutions developed by humans to solutions that learn from data (machine learning)
  • A shift from thinking of computation as logic to computation that expresses uncertainty through probability (predictive analytics and machine learning)

According to IDC research, investment in digital transformation initiatives will reach $2.2 trillion by 2019 (roughly 60% more than this year). IDC also expects that in 2018, 75% of development teams will build cognitive and AI capabilities into one or more applications. For Microsoft partners, this is an excellent opportunity to leverage and expand their services.

 

Putting research and platforms to work

Microsoft combines its rapidly advancing AI research with a powerful cloud to offer a platform that lets companies and developers flexibly build AI into their own products and services. Tools and services such as Microsoft Cognitive Services, Azure Machine Learning, and the Bot Framework form an excellent platform, grounded in that research, for building compelling AI services. Partners can integrate technologies from AI fields such as computer vision and natural language processing to build end-to-end systems that learn from data and experience.

Microsoft Research AI publishes leading research and develops practical applications through numerous projects. For example, its machine reading project, in which AI understands the content of text, holds great promise for customer support and CRM. In healthcare, machine reading systems could help doctors quickly find information about diseases and reduce the effort required for systematic reviews (exhaustively surveying and analyzing the existing literature on a specific research question).

 

The convergence of IoT and machine learning

As big data and machine learning converge with IoT, new insights are drawn from a vast number of IoT devices, which in turn generate vast amounts of data. This is where the partner opportunity lies.

IoT used to focus mainly on manufacturing, but IoT devices are now everywhere. New techniques for analyzing and acting on IoT data have enabled entirely new business models.

According to Gartner, the number of IoT devices worldwide will reach 8.4 billion by the end of this year. Furthermore, IDC predicts that by 2019 every successful IoT project will make use of cognitive or AI capabilities. For partners, this is a major opportunity to build new solutions that deliver IoT insights from data at unprecedented scale.

A partner ecosystem is needed to apply these insights in specific industries and verticals. Consider, for example, cars equipped with sensors that detect traffic, road conditions, and vehicle performance to enable autonomous driving. Such a car applies IoT: it uses machine learning algorithms, computer vision, and other sensor input to assess its situation and respond, and it combines that with real-time information from the cloud to determine the route that gets you to your destination quickly and safely.

 

Putting bots to work

Applications that communicate with people in natural language and act automatically on their behalf are called "bots." They promise a new kind of customer relationship and a closer connection with customers. When enormous amounts of data are stored, customers can struggle to find the information they need; instead of following a maze of links, they can simply ask a bot where it is. Business Insider predicts that business bots will be one of the biggest technology trends this year.

Connecting bots to corporate data sources not only lets users complete tasks quickly but also creates new ways to use data. Microsoft partner and event app provider Eventbase recently announced "Abby," a bot for events. Abby is designed to provide a conversational user experience and helps event attendees navigate large venues to their destinations. It also understands users' preferences and context, serving content and recommending sessions accordingly.

 

AI in healthcare

Microsoft India, together with the L V Prasad Eye Institute, launched the Microsoft Intelligent Network for Eyecare (MINE), an international consortium of committed companies, research institutes, and academic institutions with the goal of using AI to eliminate avoidable blindness and bring eye care services to the whole world. Working with partner institutions, it aims to build machine learning models that predict visual impairment and eye disease from diverse patient data gathered around the world.

In China, medical AI startup Airdoc used Microsoft Azure cloud services, Microsoft Cognitive Services, and the Microsoft Cognitive Toolkit to develop technology that quickly and accurately detects the onset of diabetic retinopathy from photographs of a patient's retina. The disease is a complication of diabetes and can lead to blindness without proper treatment.

Microsoft recently released version 2.0 of the Microsoft Cognitive Toolkit, a free, open-source toolkit that trains deep learning algorithms in a way inspired by how the human brain learns. In addition to the Cognitive Toolkit, developers can access Microsoft Azure's suite of cloud applications, including the machine learning APIs in Microsoft Cognitive Services.

Predictive analytics provider Cognisess used Azure and Microsoft Cognitive Services to build a performance analytics platform that lets companies predict the future performance, retention, and development needs of employees and job candidates. This machine learning and AI solution can compute millions of data points across eight capability areas and more than 100 attributes.

 

Microsoft aims to accelerate advances in AI, machine learning, and IoT while making these technologies easier to use and sharing the business opportunity with its partners.

How are you using machine learning, IoT, and cognitive services to help customers find the information they need in cloud-scale volumes of data? Let us know in the Microsoft Partner Community.

Giving minimum access privilege using Service Principal


Sometimes you may want to grant the minimum privilege needed for an Azure resource. I'd like to explain how, using minimum access to Log Analytics search as the example.

 

Create a Service Principal

The easiest way is to use Azure CLI 2.0.  The command below creates a Service Principal in Azure; you can choose any name and password for it. Note that, by default, the service principal can access almost any resource in your subscription.

 

$ az login

$ az ad sp create-for-rbac --name SPTestSP --password oMEDCPzT4vE0zIwY
WARNING: Retrying role assignment creation: 1/36
{
 "appId": "43d662d0-41af-41dc-bf4f-0cb0f1f93b3c",
 "displayName": "SPTestSP",
 "name": "http://SPTestSP",
 "password": "oMEDCPzT4vE0zIwY",
 "tenant": "YOUR_TENANT_ID"
}

 

Now you can see your new application in the Azure portal.
Look under Azure Active Directory > App registrations, or YOUR_LOG_ANALYTICS > Access control (IAM), and you will find the new application.

You can also do this via the Azure portal; however, it takes many more steps than the az command.

If you don't have multiple tenants, you can now access your Log Analytics workspace with this Service Principal.

If you want to try it, I provide a sample that accesses Log Analytics. Refer to its readme.

Remove the role

However, the default role is Contributor, which is too strong. Let's remove it.

$ az role assignment delete --assignee 43d662d0-41af-41dc-bf4f-0cb0f1f93b3c --role Contributor

Now you can see it has been removed.

Assign New Role

Go to YOUR_LOG_ANALYTICS > Access control (IAM), then push the (+ Add) button.

Now you can choose from a lot of roles. See the List of Roles of Service Principal.

Configure it and save it.

I chose Log Analytics Reader; also try Monitoring Reader. Both work. (Since I couldn't find documentation of the Log Analytics roles, I tried both; the balloon tips suggested Monitoring Reader might be the minimum.)

Now you can see it has been configured as Log Analytics Reader.

Since this role is assigned at the Log Analytics resource scope, I couldn't find it via the default az command:

$ az role assignment list --assignee 43d662d0-41af-41dc-bf4f-0cb0f1f93b3c
[]
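That said, resource-scoped assignments can usually still be inspected or created from the CLI by passing the scope explicitly. A minimal sketch; the subscription, resource group, and workspace names are placeholders, and --all asks for assignments at every scope:

$ az role assignment list --assignee 43d662d0-41af-41dc-bf4f-0cb0f1f93b3c --all

$ az role assignment create --assignee 43d662d0-41af-41dc-bf4f-0cb0f1f93b3c --role "Log Analytics Reader" --scope "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.OperationalInsights/workspaces/<WORKSPACE_NAME>"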

Let's try

After reading the readme and configuring it, let's try. It works!

Then, remove the Service Principal from YOUR_LOG_ANALYTICS > Access control (IAM).

You'll get 403: Forbidden. 🙂

To be safe, I then removed the Service Principal entirely. Now I'm safe. 🙂

Resource

Changing between Windows SKUs


In one of the sessions that we did at Ignite around Windows 10 Subscription Activation, we included this slide, which talked about the progression made through the years in simplifying the process of changing the installed SKU of Windows, primarily between Windows Pro and Windows Enterprise:

History

Judging from the reaction to my other blog post, which talked about the one new path we added with Windows 10 1709 to easily change between the Windows 10 Enterprise and Education SKUs, it appears that there are lots of people who aren’t familiar with the options that are available, or with how those options have changed over the years.

To remedy that, let’s give you a history lesson, similar to what’s in the Ignite session recording:

  • With Windows 7, if you wanted to change from Windows 7 Professional to Windows 7 Enterprise, you had to redeploy the operating system – a full wipe-and-load process.
  • With Windows 8.1, we added support for a Windows 8.1 Pro to Windows 8.1 Enterprise in-place upgrade (considered a “repair upgrade” because the OS version was the same before and after).  This was a lot easier than wipe-and-load, but it was still time-consuming.
  • With Windows 10 1507, we added the ability to install a new product key via a provisioning package or using MDM to change the SKU.  This would require a reboot, which would install the new OS components.  This took several minutes to complete, but it was a lot quicker than in-place upgrade.
  • With Windows 10 1607, we made a big leap: Now you can just change the product key and the SKU instantly changes from Windows 10 Pro to Windows 10 Enterprise.  In addition to provisioning packages and MDM, you can just inject a key using SLMGR.VBS (which just injects the key into WMI), so it’s now trivial to do this via a command line too.
  • With Windows 10 1703, we made this “step-up” from Windows 10 Pro to Windows 10 Enterprise automatic for those that subscribed to Windows 10 Enterprise E3 or E5 via the CSP program.
  • With Windows 10 1709, we added support for Windows 10 Subscription Activation, very similar to the CSP support but for large enterprises, enabling the use of Azure AD for assigning licenses to users; when those users sign in to an AD or Azure AD-joined machine, it automatically steps up from Windows 10 Pro to Windows 10 Enterprise.

Enough of the history lesson, then.  Let’s switch to “how-to” and focus on the most common scenarios today.

Scenario #1:  You’re using KMS for activation, you just purchased Windows 10 Enterprise E3 or E5 subscriptions (or have had an E3 or E5 subscription for a while but haven’t yet deployed Windows 10 Enterprise), and you are running Windows 10 1607 or above.

All you need to do to change all of your Windows 10 Pro machines to Windows 10 Enterprise is to run this command on each machine:

cscript.exe c:\windows\system32\slmgr.vbs /ipk NPPR9-FWDCX-D2C8J-H872K-2YT43

In case you are wondering, that key came from https://technet.microsoft.com/en-us/library/jj612867(v=ws.11).aspx.  It causes the OS to change to Windows 10 Enterprise and then seek out the KMS server to reactivate.  (And yes, if you really want to, you can inject the Windows 10 Pro key from that page to step back down from Enterprise to Pro.)
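If you want to confirm that the edition changed and reactivated, here is a quick sketch using built-in tools:

REM Display the current edition and license status
cscript.exe c:\windows\system32\slmgr.vbs /dli

REM Show the currently installed edition
DISM /Online /Get-CurrentEdition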

Scenario #2:  You’re using Azure AD-joined devices, or Active Directory-joined devices with Azure AD synchronization set up (“hybrid join,” as discussed in the video above), and you are running Windows 10 1709 or above.

All you need to do is follow the steps in https://docs.microsoft.com/en-us/windows/deployment/windows-10-enterprise-subscription-activation to acquire a $0 SKU that adds a new Windows 10 Enterprise E3 or E5 license in Azure AD, and then assign that license to all of your Azure AD users (these can be AD-synced accounts).  The device will then automatically change from Windows 10 Pro to Windows 10 Enterprise when such a user signs in.

Summary

If you have a Windows 10 Enterprise E3 or E5 subscription, but are still running Windows 10 Pro, it’s really simple (and quick) to move to Windows 10 Enterprise using one of the scenarios above.

If you’re running Windows 7, it can be a lot more work.  While a wipe-and-load approach works fine, it would be easier to upgrade from Windows 7 Pro directly to Windows 10 Enterprise (yes, that’s a supported path), which takes care of the move in one hop.  (That works for Windows 8.1 Pro as well.)

Four steps to earning a cloud competency!


The Microsoft Partner Network is built on three membership levels: network membership, the Microsoft Action Pack subscription (MAPS), and Microsoft competencies at the silver or gold level.

Creating a network membership is completely free and comes with core benefits, including free training to get you started. At this level you also get access to resources for connecting with potential customers.

MAPS is an annual subscription that costs about US$475. At this membership level you get access to developer tools, internal training materials, and services and software your company can use for testing with its own solutions before reaching the deployment phase with a customer.

The third membership level is about earning competencies. This means that you as a partner attain competencies at an elite level, demonstrating your expertise in Microsoft technology. Competencies are earned by passing exams or online assessments, providing customer references that the customers register themselves, and demonstrating that you drive customer consumption or have a share of customers using your solutions tied to the competency. The requirements vary in scale for each competency at the silver and gold levels.

Today there are 17 different competencies in the Microsoft Partner Network, five of which are cloud competencies. Each of the five cloud competencies is based on a Microsoft cloud product, and earning one of them involves four requirements:

 

Demonstrate sales revenue

Each competency has its own requirements, and depending on your strategic direction, you as a partner must be able to document the cloud revenue you drive. There are two ways to earn revenue credit through your customers:

  • Digital Partner of Record (DPOR) is how partners show they are helping the customer adopt the technology. The customer nominates you as the partner that deployed the product for them. Being a DPOR also means you may be able to earn incentives, in the form of funds that can be applied toward prospective projects. Read more about the latest update on this here.
  • 'Partner Association' allows the customer to associate partners who are not the DPOR through 'Delegated Admin Privileges' (DAP). This means you as a partner can still have your revenue count toward earning a cloud competency.

 

Pass technical exams

Being technically trained in Microsoft technology carries a lot of weight with customers looking for well-qualified partners. Each competency has a set of assessments, exams, and certifications that the partner is required to complete. These technical requirements change as product offerings and market demands evolve. An overview of course materials for the various exams and certifications is available via the Partner University Learning Paths.

Once an individual completes and passes an exam or certification, they become a Microsoft Certified Professional (MCP) and are assigned an associated ID. For each exam to count toward a competency, it is important that the person's MCP ID is associated with the organization's MPN ID. Here you can find instructions on how to associate yourself with an organization in MPN.

 

Provide customer references

Part of becoming a recognized, certified partner is building trust with the customer. Microsoft requires that a partner provide customer references as part of business-focused competency assessments. The customer confirms that they have worked with you and are satisfied with what you delivered. References must be approved by all parties and by Microsoft before they can be used toward earning or renewing a competency, and they are valid for up to two years. Most silver competencies require three customer references, while gold competencies require five.

 

Pay the subscription fee

The subscription fee is paid once a year, regardless of how many competencies your company earns during the year. In FY17 the cloud competencies were subsidized (US$1,530 for silver and US$3,940 for gold). As of October 1, 2017, all cloud competency prices were aligned with those of the other competencies (US$1,670 for silver and US$4,730 for gold). This price change takes effect for partners when they renew their membership.

By meeting these four requirements, partners receive the following benefits in addition to the core benefits:

  • Extended cloud support
  • An increased number of internal-use licenses
  • Guaranteed account follow-up
  • Eligibility for incentives

 

Thinking about getting certified in Microsoft technology but need help getting started? The Cloud Enablement Desk (CED) can help you reach silver/gold status, help you grow your business, and build technical expertise in selling leading cloud solutions. The CED gives you access to a Cloud Program Specialist (CPS) for up to six months, who will guide you through all the steps toward a silver/gold competency and provide all the technical and business resources and tools needed along the way.

To be eligible for Cloud Enablement Desk support, you as a partner must:

  • Be an active member of the Microsoft Partner Network with an MPN ID
  • Have completed at least one cloud transaction
  • Not have a Microsoft Partner Sales Executive contact (partner account manager)
  • Not already hold a Microsoft cloud competency

If your company would like help earning cloud competencies, you can nominate your company for Cloud Enablement Desk support here.

 

Start your membership today via the MPN home page and explore your options.

Follow us on these MPN Norway channels for regular updates on news, training, and tips, and to connect with other partners:

Facebook

Yammer

Twitter
