Channel: TechNet Blogs

WDAGUtilityAccount


If you see an alert in your log solution for a WDAGUtilityAccount user account being created (event ID 4720 or 4722), this is the Windows Defender Application Guard account included with RS3 (a.k.a. the Windows 10 Fall Creators Update). The account is provisioned as disabled.

Basically, you have a user enrolled in the Windows 10 Insider program.
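To confirm this on a client, a quick check from an elevated PowerShell prompt can be used (a sketch; assumes Windows 10 RS3 or later, where the account exists):

```powershell
# The account should exist but report Enabled = False
Get-LocalUser -Name "WDAGUtilityAccount" | Format-List Name, Enabled, Description
```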

More on this coming later.


[Free Download] 7 Ways to Work More Efficiently in the Cloud (e-Book) [Updated 7/16]


 

Collaboration and communication are critical to business success. In practice, however, it is difficult to get work done together with employees and customers who use different productivity tools.

Download our free e-book to learn how to get the most out of the cloud, and discover how to:

• Improve communication and collaboration

• Increase your company's mobility

• Gain insights from your data

 

Download the free e-book "7 Ways to Work More Efficiently in the Cloud" here

 

Tip of the Day: Windows 10 S FAQ


We interrupt last week's rundown of top Defrag Tools episodes to bring you something I found in my inbox:

You may or may not have heard buzz around Windows 10 S.  If you haven't, Windows 10 S is a specific configuration of Windows 10 Pro that offers a familiar, productive Windows experience that's streamlined for security and performance. By exclusively using apps in the Windows Store and ensuring that you browse safely with Microsoft Edge, Windows 10 S keeps you running fast and secure day in and day out.

For a list of common questions around Windows 10 S, see the Windows 10 S FAQ.

Dealing with the Report Server Report Rendering Object Model


After creating a working stub of a Custom Rendering Extension, it was time to add the actual functionality. To do so, the Report Server Object Model used for report rendering has to be understood. In this blog post, I want to share some insights gained during this process. It will hopefully speed up getting familiar with the Report Server Object Model, but it isn't a replacement for the existing reference documentation and the need to dig deeper yourself.

Namespaces

Depending on the type of Reporting Extension (Processing, Rendering, Delivery, etc.), different namespaces are used. The types themselves are similar, but not equal. E.g., the Processing Extension uses the Microsoft.ReportingServices.RdlObjectModel namespace, which contains details for the report parameter layout; these are irrelevant for report rendering and therefore missing from the Microsoft.ReportingServices.OnDemandReportRendering namespace.

This blog post is focused on the Report Rendering Object Model. Nevertheless, it is useful to first identify the corresponding element and structure in the report definition, and then look for the corresponding elements in the Rendering Object Model.

E.g., Table => Tablix type, in SSDT and in the RDL file:

Type Hierarchies & Structure

Report Structure

The overall report is structured into Report Sections, containing Report Items in their Report Body:

Report Item Hierarchy

Report items are the visual elements dragged into a report. The type for Table, Matrix, and List is always Tablix. Similarly, Data Bar and Chart are both of type Chart.

Report Element Hierarchy

Report Element is the base type of Report Item, along with some other derived types used for the internal report structure.

Basic Type Structure

There are two kinds of types: the ones for the report definition, containing the uncalculated expressions, e.g. "=Fields!xyz.Value", and the ones carrying the actual calculated values. The latter usually have the type postfix "Instance", e.g. "TextBox" vs. "TextBoxInstance". The instance types are much more differentiated than the definition types in order to distinguish between static and dynamic elements; e.g., in a table, header columns are static while data-bound columns are dynamic, requiring row-by-row processing of the underlying data.

The following three types are the base Report Definition Types for many other types in the Report Object Model, containing some basic information of all elements:

Dynamic Instances / Row-Processing

If an element is bound to a dataset, the instance contains the value of the current row. To get the values of all rows, they have to be processed.

Common methods, shown here for TablixDynamicMemberInstance:

  • void ResetContext() => Reset before first instance
  • bool MoveNext() => Move to next instance, return true if successful
  • int GetInstanceIndex() => Index of current instance
  • bool SetInstanceIndex(int index) => Move to specific instance, return true if successful

Types supporting these methods:

  • ChartDynamicMemberInstance
  • DataDynamicMemberInstance
  • GaugeDynamicMemberInstance
  • MapDynamicMemberInstance
  • TablixDynamicMemberInstance

To process all instances, call ResetContext() and then handle each instance inside a while (instance.MoveNext()) loop. Depending on the complexity of the report item, several nested passes may be required to get all the data, e.g. in a Tablix with dynamic rows and columns.
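As a minimal sketch of this pattern (assuming a TablixMember obtained from a row or column hierarchy; ProcessCurrentRow is a hypothetical placeholder for your own per-row logic):

```csharp
// Minimal iteration sketch for a dynamic member.
if (!member.IsStatic)
{
	TablixDynamicMemberInstance instance =
		(TablixDynamicMemberInstance)member.Instance;

	instance.ResetContext();       // rewind before the first instance
	while (instance.MoveNext())    // advance row by row
	{
		// the instance objects now reflect the current data row
		ProcessCurrentRow(member);
	}
}
```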

Understanding Tablix and Chart

Tablix and Chart are the more complex of the common report items.

Tablix

A Tablix may contain several row and column hierarchies. Each of them may contain several TablixMembers, which may be nested. To see all items in SSDT, switch to Advanced Mode:

Depending on the nature of the Tablix member, either static or dynamic (property TablixMember.IsStatic), its instance is either of type TablixMemberInstance or TablixDynamicMemberInstance.

Matrix-Example

Report Definition:

Report Preview:

Diagnostics Output:

- Tablix1: Type = Tablix
, Rows = 1, Columns = 1
=== Row Hierarchies ===
 - 0: isStatic = False
=== Column Hierarchies ===
 - 0: isStatic = False
=== Stats: dynamic rows = 1, columns = 1 ===
=== Tablix Data ===
Data in format row hierarchy | column hierarchy | row recursion level | column recursion level | dataRowCounter-Prefix: val-1, val-2 ...
0 | 0 | 1 | 0 | /1/1:458, 487, 498, 274
0 | 0 | 1 | 0 | /1/2:1.159.731, 1.175.475, 1.201.006, 646.509
0 | 0 | 1 | 0 | /2/1:342, 395, 416, 167
0 | 0 | 1 | 0 | /2/2:778.424, 969.668, 962.583, 381.662
0 | 0 | 1 | 0 | /3/1:350, 335, 367, 127
0 | 0 | 1 | 0 | /3/2:859.417, 779.221, 804.262, 351.505
0 | 0 | 1 | 0 | /4/1:229, 251, 296, 139
0 | 0 | 1 | 0 | /4/2:514.022, 617.589, 757.623, 410.201
0 | 0 | 1 | 0 | /5/1:1.027, 1.115, 1.248, 488
0 | 0 | 1 | 0 | /5/2:2.448.470, 2.685.292, 2.888.463, 1.128.373
0 | 0 | 1 | 0 | /6/1:454, 571, 643, 258
0 | 0 | 1 | 0 | /6/2:1.161.575, 1.330.468, 1.540.239, 586.586

Code to get the Diagnostics Output:

public static void DebugItem(Tablix tablix)
{
	Debug.WriteLine(string.Format("- {0}: Type = {1}",
			tablix.Name,
			tablix.GetType().Name
			));
	Debug.WriteLine(string.Format(", Rows = {0}, Columns = {1}", tablix.Body.RowCollection.Count, tablix.Body.ColumnCollection.Count));

	int i = 0;
	Debug.WriteLine("=== Row Hierarchies ===");
	List<TablixMember> dynamicRows = new List<TablixMember>();
	foreach (TablixMember m in tablix.RowHierarchy.MemberCollection)
	{
		Debug.WriteLine(string.Format(" - {0}: isStatic = {1}", i++, m.IsStatic));
		if (!m.IsStatic)
		{
			dynamicRows.Add(m);

		}
	}

	i = 0;
	Debug.WriteLine("=== Column Hierarchies ===");
	List<TablixMember> dynamicColumns = new List<TablixMember>();
	foreach (TablixMember m in tablix.ColumnHierarchy.MemberCollection)
	{
		Debug.WriteLine(string.Format(" - {0}: isStatic = {1}", i++, m.IsStatic));
		if (!m.IsStatic)
		{
			dynamicColumns.Add(m);
		}
	}

	// Print data: if classic table
	Debug.WriteLine(string.Format("=== Stats: dynamic rows = {0}, columns = {1} ===", dynamicRows.Count, dynamicColumns.Count));

	Debug.WriteLine("=== Tablix Data ===");
	if (dynamicColumns.Count == 0)
	{
		// Process rows
		foreach (TablixMember m in tablix.RowHierarchy.MemberCollection)
		{
			if (m.IsStatic)
			{
				Debug.WriteLine(CreateRowCsv(tablix, m, ";"));
			}
			else
			{
				TablixDynamicMemberInstance dynamicInstance = (TablixDynamicMemberInstance)m.Instance;
				dynamicInstance.ResetContext();
				while (dynamicInstance.MoveNext())
				{
					Debug.WriteLine(CreateRowCsv(tablix, m, ";"));
				}
			}
		}
	}
	else
	{
		Debug.WriteLine("Data in format row hierarchy | column hierarchy | row recursion level | column recursion level | dataRowCounter-Prefix: val-1, val-2 ...");
		// Process rows
		foreach(TablixMember mR in tablix.RowHierarchy.MemberCollection)
		{
			DebugRowHierarchy(tablix, mR, 0, "");
		}
	}
}

/// <summary>
/// Recursively processes all row hierarchies, then all column hierarchies, and finally the values themselves.
/// </summary>
/// <param name="tablix"></param>
/// <param name="rowMember"></param>
/// <param name="rowDepth">0..n, depth of row: 0 = starting level, 1 = child, 2 = child of child...</param>
/// <param name="dataRowCounterPrefix">"/i/j..." prefix recording the data row indices of enclosing dynamic members</param>
private static void DebugRowHierarchy(Tablix tablix, TablixMember rowMember, int rowDepth, string dataRowCounterPrefix)
{
	if (rowMember.Children == null || rowMember.Children.Count == 0)
	{
		// row-hierarchy has no children => process columns, either static or dynamic
		if (rowMember.IsStatic)
		{
			foreach (TablixMember mC in tablix.ColumnHierarchy.MemberCollection)
			{
				DebugColumnHierarchyForRow(tablix, rowMember, rowDepth, mC, 0, dataRowCounterPrefix);
			}

		}
		else
		{
			int dataRowCounter = 0;
			TablixDynamicMemberInstance dynamicInstance = (TablixDynamicMemberInstance)rowMember.Instance;
			dynamicInstance.ResetContext();
			while (dynamicInstance.MoveNext())
			{
				foreach (TablixMember mC in tablix.ColumnHierarchy.MemberCollection)
				{
					dataRowCounter++;
					DebugColumnHierarchyForRow(tablix, rowMember, rowDepth, mC, 0, string.Format("{0}/{1}",dataRowCounterPrefix, dataRowCounter));
				}
			}
		}
	} else
	{
		// row-hierarchy has children => process children, either static or dynamic
		if (rowMember.IsStatic)
		{
			foreach (TablixMember c in rowMember.Children)
			{
				DebugRowHierarchy(tablix, c, rowDepth + 1, dataRowCounterPrefix);
			}
		}
		else
		{
			int dataRowCounter = 0;
			TablixDynamicMemberInstance dynamicInstance = (TablixDynamicMemberInstance)rowMember.Instance;
			dynamicInstance.ResetContext();
			while (dynamicInstance.MoveNext())
			{
				foreach (TablixMember c in rowMember.Children)
				{
					dataRowCounter++;
					DebugRowHierarchy(tablix, c, rowDepth + 1, string.Format("{0}/{1}", dataRowCounterPrefix, dataRowCounter));
				}
			}
		}
	}
}

private static void DebugColumnHierarchyForRow(Tablix tablix, TablixMember rowMember, int rowDepth, TablixMember colMember, int colDepth, string dataRowCounterPrefix)
{
	if (colMember.Children == null || colMember.Children.Count == 0)
	{
		// col-hierarchy has no children => process values, either static or dynamic
		if (colMember.IsStatic)
		{
			// static column => print the single cell value directly
			Debug.WriteLine(string.Format("{0} | {1} | {2} | {3} | {4}: {5}",
				rowMember.MemberCellIndex, colMember.MemberCellIndex,
				rowDepth, colDepth,
				dataRowCounterPrefix,
				GetValue(tablix, rowMember.MemberCellIndex, colMember.MemberCellIndex)
				));
		}
		else
		{
			StringBuilder str = new StringBuilder();
			str.AppendFormat("{0} | {1} | {2} | {3} | {4}:",
					rowMember.MemberCellIndex, colMember.MemberCellIndex,
					rowDepth, colDepth,
					dataRowCounterPrefix
				);
			TablixDynamicMemberInstance dynamicInstance = (TablixDynamicMemberInstance)colMember.Instance;
			bool addSeparator = false;
			dynamicInstance.ResetContext();
			while (dynamicInstance.MoveNext())
			{
				// add separator, except before first value
				if (addSeparator)
				{
					str.Append(", ");
				}
				else
				{
					addSeparator = true;
				}
				// add value
				str.Append(GetValue(tablix, rowMember.MemberCellIndex, colMember.MemberCellIndex));
			}
			Debug.WriteLine(str);
		}
	}
	else
	{
		// column-hierarchy has children => process children, either static or dynamic
		if (colMember.IsStatic)
		{
			foreach (TablixMember c in colMember.Children)
			{
				DebugColumnHierarchyForRow(tablix, rowMember, rowDepth, c, colDepth + 1, dataRowCounterPrefix);
			}
		}
		else
		{
			int dataRowCounter = 0;
			TablixDynamicMemberInstance dynamicInstance = (TablixDynamicMemberInstance)colMember.Instance;
			dynamicInstance.ResetContext();
			while (dynamicInstance.MoveNext())
			{
				dataRowCounter++;
				foreach (TablixMember c in colMember.Children)
				{
					DebugColumnHierarchyForRow(tablix, rowMember, rowDepth, c, colDepth + 1, string.Format("{0}/{1}", dataRowCounterPrefix, dataRowCounter));
				}
			}
		}
	}
}

public static string CreateRowCsv(Tablix tablix, TablixMember row, string separator)
{
	StringBuilder b = new StringBuilder();
	int count = tablix.Body.ColumnCollection.Count;
	int i = 0;
	while (i < count)
	{
		CellContents content = tablix.Body.RowCollection[row.MemberCellIndex][i].CellContents;
		b.Append(GetValue(content));

		// adjust column counter
		i += content.ColSpan;

		// add separator, if not last column
		if (i < count)
		{
			b.Append(separator);
		}
	}
	return b.ToString();
}

public static string GetValue(Tablix tablix, int rowHierarchy, int columnHierarchy)
{
	TablixCell tablixCell = tablix.Body.RowCollection[rowHierarchy][columnHierarchy];
	if (tablixCell != null) {
		return GetValue(tablixCell.CellContents);
	}
	return null;
}

public static string GetValue(CellContents content)
{
	// add content, if textbox - TODO: support other cell content
	if (content != null && content.ReportItem != null)
	{
		TextBoxInstance textBoxInstance = content.ReportItem.Instance as TextBoxInstance;
		if (textBoxInstance != null)
		{
			return textBoxInstance.Value;
		}
		else if (content.ReportItem != null)
		{
			return string.Format("UNSUPPORTED REPORT-ITEM {0}", content.ReportItem.GetType().Name);
		}
	}
	return null;
}

Chart

The visual elements of a chart are grouped into Chart Areas.
The data itself is split into 4 dimensions:

  • Series Hierarchy
  • Category Hierarchy
  • Series Collection => containing a Chart Area Name and a Chart Series Type (Bar, Column, Line etc.)
  • Data Points

Example

Chart Definition:

Chart Preview:

Diagnostics Output:

- Chart1: Type = Chart, Dataset = DSTableList
=== Chart Titles ===
 - 0: Caption = Table-Count by Schema
=== Areas ===
 - 0: Name = Default
 - 1: Name = Area1
=== Series Hierarchies ===
 - 0: Label = Table Name, isStatic = True
=== Category Hierarchies ===
 - 0: Label = =Fields!SchemaName.Value, isStatic = False
=== Series ===
 - 0: Name = TableName, Chart Area = , Chart Series Type = Bar
=== Derived Series ===
 - 1: Name = DerivedSeries1, Chart Area = Area1, Chart Series Type = Line, Formula = MovingAverage
=== Chart Details ===
- Series Hierarchy | Category Hierarchy | Series-Name | Series-Type: Values (x|y)
 - Table Name | Application | TableName | : (|15)
 - Table Name | dbo | TableName | : (|1)
 - Table Name | Purchasing | TableName | : (|7)
 - Table Name | Sales | TableName | : (|12)
 - Table Name | Warehouse | TableName | : (|14)

Code to get the Diagnostics Output:

public static void DebugItem(Chart chart)
{
 Debug.WriteLine(string.Format("- {0}: Type = {1}, Dataset = {2}",
   chart.Name,
   chart.GetType().Name,
   chart.DataSetName
   ));

 int i = 0;
 Debug.WriteLine("=== Chart Titles ===");
 foreach (ChartTitle t in chart.Titles)
 {
  Debug.WriteLine(string.Format(" - {0}: Caption = {1}", i++, ((ChartTitleInstance)t.Instance).Caption));
 }

 i = 0;
 Debug.WriteLine("=== Areas ===");
 foreach (ChartArea a in chart.ChartAreas)
 {
  Debug.WriteLine(string.Format(" - {0}: Name = {1}", i++, a.Name));
 }

 i = 0;
 Debug.WriteLine("=== Series Hierarchies ===");
 foreach (ChartMember m in chart.SeriesHierarchy.MemberCollection)
 {
  Debug.WriteLine(string.Format(" - {0}: Label = {1}, isStatic = {2}", i++, (m.Label.IsExpression ? m.Label.ExpressionString : m.Label.Value), m.IsStatic));
 }

 i = 0;
 Debug.WriteLine("=== Category Hierarchies ===");
 foreach (ChartMember m in chart.CategoryHierarchy.MemberCollection)
 {
  Debug.WriteLine(string.Format(" - {0}: Label = {1}, isStatic = {2}", i++, (m.Label.IsExpression ? m.Label.ExpressionString : m.Label.Value), m.IsStatic));
 }

 i = 0;
 Debug.WriteLine("=== Series ===");
 foreach (ChartSeries s in chart.ChartData.SeriesCollection)
 {
  Debug.WriteLine(string.Format(" - {0}: Name = {1}, Chart Area = {2}, Chart Series Type = {3}", i++, s.Name, s.ChartAreaName, s.Type.Value));
 }
 Debug.WriteLine("=== Derived Series ===");
 foreach (ChartDerivedSeries s in chart.ChartData.DerivedSeriesCollection)
 {
  Debug.WriteLine(string.Format(" - {0}: Name = {1}, Chart Area = {2}, Chart Series Type = {3}, Formula = {4}", i++, s.Series.Name, s.Series.Instance.ChartAreaName, s.Series.Type.Value, s.DerivedSeriesFormula));
 }

 Debug.WriteLine("=== Chart Details ===");
 Debug.WriteLine("- Series Hierarchy | Category Hierarchy | Series-Name | Series-Type: Values (x|y)");
 foreach (ChartMember m in chart.SeriesHierarchy.MemberCollection)
 {
  DebugChartSeriesHierarchy(chart, m);
 }
}

private static void DebugChartSeriesHierarchy(Chart chart, ChartMember seriesHierarchy)
{
 if (seriesHierarchy.IsStatic)
 {
  foreach (ChartMember cm in chart.CategoryHierarchy.MemberCollection)
  {
   DebugChartCategoryHierarchy(chart, seriesHierarchy, cm);
  }
 }
 else
 {
  ChartDynamicMemberInstance dI = (ChartDynamicMemberInstance)seriesHierarchy.Instance;
  dI.ResetContext();
  while (dI.MoveNext())
  {
   foreach (ChartMember cm in chart.CategoryHierarchy.MemberCollection)
   {
    DebugChartCategoryHierarchy(chart, seriesHierarchy, cm);
   }
  }
 }
}

private static void DebugChartCategoryHierarchy(Chart chart, ChartMember seriesHierarchy, ChartMember categoryHierarchy)
{
 if (categoryHierarchy.IsStatic)
 {
  foreach (ChartSeries s in chart.ChartData.SeriesCollection)
  {
   DebugChartSeries(chart, seriesHierarchy, categoryHierarchy, s);
  }
 }
 else
 {
  ChartDynamicMemberInstance dI2 = (ChartDynamicMemberInstance)categoryHierarchy.Instance;
  dI2.ResetContext();
  while (dI2.MoveNext())
  {
   foreach (ChartSeries s in chart.ChartData.SeriesCollection)
   {
    DebugChartSeries(chart, seriesHierarchy, categoryHierarchy, s);
   }
  }
 }
}

private static void DebugChartSeries(Chart chart, ChartMember seriesHierarchy, ChartMember categoryHierarchy, ChartSeries series)
{
 StringBuilder sDataPoint = new StringBuilder();
 sDataPoint.AppendFormat(" - {0} | {1} | {2} | {3}: ", seriesHierarchy.Instance.Label, categoryHierarchy.Instance.Label, series.Name, series.Instance.ChartAreaName);
 for (int iDataPoint = 0; iDataPoint < series.Count; iDataPoint++)
 {
  ChartDataPoint p = series[iDataPoint];
  ChartDataPointValues pV = p.DataPointValues;
  ChartDataPointValuesInstance pVI = pV.Instance;
  sDataPoint.AppendFormat("({0}|{1})", pVI.X, pVI.Y);
  if (iDataPoint < series.Count - 1)
  {
   sDataPoint.Append(", ");
  }
 }
 Debug.WriteLine(sDataPoint);
}

Conclusion

In just a few days, I've gotten only a first glimpse of what is possible with a Custom Rendering Extension. This post therefore covers only part of the overall picture, but it is hopefully a good start if you have to continue this journey.


First Windows Server Build Announced in the Insider Preview Program


In May 2017, Microsoft announced that preview versions of Windows Server would be released as part of the Windows Insider program.

The first public preview of Windows Server, build 16237, has now been released. A detailed list of new features is available on the Windows blog. Windows Insider and Windows Insider for Business participants can download preview builds from the program page.

Building the Next Generation of Intelligent Apps and Services [Updated 7/17]


(This article is a translation of Building the Next Generation of Intelligent Apps and Services, published on the Microsoft Partner Network blog on June 7, 2017. Please refer to the original page for the latest information.)

Software continues to change the world in amazing ways, and developers are at the center of that change. For me, it is always a pleasure to see partners use the latest Microsoft technologies to build new products and solutions for their customers.

Developers are right in the middle of the digital transformation. By attending events such as Microsoft Inspire, partners can learn how to use these groundbreaking technologies to build cutting-edge intelligent cloud apps, simple end-to-end DevOps experiences, AI-infused apps that were previously unimaginable, and much more.

Microsoft partners are scaling their success

One Microsoft partner achieving real results is Mirabeau. Based in Amsterdam, the company combines strategy, design, DevOps, and insights to deliver the best possible user experience. It drives digital transformation by adopting the right visuals, applying intuitive technology, and staying highly adaptable with an eye on the future. I spoke with Peter Bakker, client manager at Mirabeau, about how the company partnered with Microsoft and used Microsoft Cognitive Services to build a chatbot for the Dutch airline Transavia.

Establishing a unique position in the industry

For Mirabeau, it is all about great ideas and execution, Bakker says. "We value executing great ideas flawlessly, applying our expertise, and seeing the world from the end user's point of view. On behalf of our customers, we turn the best ideas into solutions and bring them to market in the shortest possible time. Becoming part of Cognizant Digital Business has broadened our services and allowed us to expand globally."

Knowing your customers and delivering the best solution to their challenges

Mirabeau's customers are major companies around the world. To stay competitive, the company needs to stay close to them. Keeping a step ahead of competitors in an ever-changing digital world requires the latest technology, so Mirabeau organizes multidisciplinary "agile" teams that can respond quickly to technological progress and changing needs.

Transavia is an independent company within the Air France-KLM group and one of Europe's leading low-cost airlines. Transavia offered its customers a chat app, but customer usage patterns were changing significantly. Then an executive on a vacation trip realized that the app should become the basis for a new service, one that went beyond a plain human interface. That is how the chatbot was born.

What sparks a solution or product

Bakker explains: "Digital media is developing at a tremendous pace, with no sign of slowing down. Chat is rapidly becoming a mainstream means of communication among mobile users. Conversational interfaces let customers interact with companies directly on messenger platforms such as WhatsApp and Facebook Messenger. KPCB, the well-known Silicon Valley venture capital firm, predicts this will be a disruptive advance."

"When I first heard Mirabeau's proposal, I loved it immediately. It was clear to us that chatbots would be the next big wave, and they showed us how to ride that wave to deliver real added value to Transavia's customers."

- Ines Verburgh, Innovation Lead, Transavia

The industry today and tomorrow

Companies must stay close to their customers, Bakker believes. It is essential to engage with each person individually while making full use of the latest technology, building a meaningful relationship with every customer through continuous dialogue. Above all, companies must deliver on the promises they make to customers.

"In the future, digital will be completely personal. Only companies that truly understand their customers and can meet their needs will survive, and Mirabeau has grown by striving to be such a company. We secure every part of the customer journey, whichever channel it runs through: websites, mobile, bots, machine learning, IoT, and offline channels." (Bakker)

What they would change next time

According to Bakker, when Mirabeau first adopted the Language Understanding Intelligent Service in Microsoft Cognitive Services, the service was still working on adding Dutch support, so they started with English conversations. "Feedback from users to Mirabeau and Transavia made us keenly aware of how important Dutch-language conversations were. Dutch was recently added to Language Understanding (Intelligent Service), so Transavia's chatbot can now converse smoothly in Dutch as well. We are currently building chatbots for other customers, which are scheduled to go into production in a few months." (Bakker)

By building next-generation intelligent apps, Mirabeau is revolutionizing the customer experience, and I find the story truly inspiring. We will continue to share stories of partners using Microsoft Azure to shape the future of technology and business. Stay tuned.

How are you partnering with Microsoft to build new technology for your customers? And what kind of partnership would you like to build with Microsoft going forward? We would love to hear your thoughts.

New Office 365 Desktop Application Deployment Capabilities In Microsoft Intune


Over the last couple of months we've seen different ways of deploying Microsoft Office 365 Pro Plus to Windows 10 PCs: first we saw the option inside Intune for Education, and then last week at Inspire we saw how this works from the Office 365 portal in Microsoft 365 Business. For those who aren't using either of those options but still want an incredibly easy way to deploy Office 365 Pro Plus or Office 365 Business, Microsoft Intune now offers one.

First of all, in the Azure Portal, open the Intune blade and click on Mobile apps.

Once in Mobile apps, you can see more options for app management, and we start by clicking on the Apps link.

I've already synchronised this account with Windows Store for Business, so you can see some of the default apps that it adds to the app list. From here we click Add.

 

From the drop-down we choose Office 365 Pro Plus Suite (Windows 10), which, as you will see, also works for those of you on Office 365 Business plans.

Starting with Configure App Suite, it's not very likely that you will need to select all of the options, but in this case I've just done it to make sure I get all of the application icons exposed.

Next up is App Suite Information, where all we really need to do is populate the Suite Name and Suite Description fields; the others are either pre-populated or not required.

App Suite Settings gives us the chance to select 32- or 64-bit, the update channel, EULA acceptance, and shared computer activation, along with additional languages.

Once configured we Add the app.

The blue notification bar advises that we need to assign the application to at least one user group, so click Assignments.

I've already got a group set up for deployment, so I can Select that group.

Next I will make the app Required to kick start the installation process on the client.

 

Switching over to the Intune MDM-enrolled Windows 10 client, you can see that the Click-to-Run installer components are running, as displayed in Task Manager.

As this is going to be over 1GB in size, it might take a while to download, depending on your connection speed.

Once the installation is complete, we can see the Office Pro Plus applications on the Start menu.

Here's where the Pro Plus versus Business conversation starts. As you can see, the expected version of Office is installed. However, it hasn't automatically activated, because this user doesn't have a Pro Plus/E3/E5 license assigned; they only have Office 365 Business.

If I close Word and then reopen it, you can see that the edition has changed from Pro Plus to Business, no additional configuration required.

And finally, once Office has reconfigured itself as Office 365 Business, everything is ready to go.

Understanding Safety Net


Let's keep it simple: Exchange Safety Net takes over where shadow redundancy leaves off. Once a message is delivered to the mailbox, and before it is successfully replicated to the passive mailbox database copies, it is stored in a queue associated with the Transport service on the Mailbox server. This queue is called Safety Net, and it is an evolution of the transport dumpster from Exchange 2010. By default, Safety Net holds a copy of a successfully delivered message for two days.

 

Unlike shadow redundancy, Safety Net does not require a DAG; if the server is not a DAG member, Safety Net stores the successfully delivered messages on another Mailbox server in the same Active Directory site.

 

Safety Net itself is not a single point of failure. If the Primary Safety Net fails, the messages are redelivered from the Shadow Safety Net. You can configure this behavior with the Set-TransportConfig cmdlet.
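For example, from the Exchange Management Shell (a sketch; the two-day value shown for SafetyNetHoldTime is the default):

```powershell
# Inspect the current Safety Net settings
Get-TransportConfig | Format-List SafetyNetHoldTime, ShadowRedundancyEnabled

# Extend how long Safety Net keeps successfully delivered messages
# (format is days.hours:minutes:seconds; 2.00:00:00 is the default)
Set-TransportConfig -SafetyNetHoldTime 7.00:00:00
```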

 

How does it work?

Shadow redundancy keeps a redundant copy of a message while the message is in transit, whereas Safety Net keeps a redundant copy after the message has been successfully processed. So Safety Net begins where shadow redundancy ends. The concepts from shadow redundancy, including the transport high availability boundary, primary messages, primary servers, shadow messages, and shadow servers, also apply to Safety Net.

The Primary Safety Net exists on the Mailbox server that held the primary message before the message was successfully processed by the Transport service. This could mean the message was delivered to the Mailbox Transport Delivery service on the destination Mailbox server. Or, the message could have been relayed through the Mailbox server in an Active Directory site that's designated as a hub site on the way to the destination DAG or Active Directory site. After the primary server processes the primary message, the message is moved from the active delivery queue into the Primary Safety Net on the same server.

The Shadow Safety Net exists on the Mailbox server that held the shadow message. After the shadow server determines the primary server has successfully processed the primary message, the shadow server moves the shadow message from the shadow queue into the Shadow Safety Net on the same server. Although it may seem obvious, the existence of the Shadow Safety Net requires shadow redundancy to be enabled (it is enabled by default).


Office 365 Weekly Digest | July 9 - 15, 2017


Welcome to the July 9 - 15, 2017 edition of the Office 365 Weekly Digest. Please see this page for details on a change regarding the inclusion of Message Center notification details in these weekly posts.

A new addition to the upcoming events is an opportunity for educators to attend a Microsoft Innovative Educator (MIE) Academy session in various locations throughout the United States. The first session is on July 21st in San Antonio, TX, followed by a second session in San Diego, CA on July 27th.

The unveiling of Microsoft 365 at the Inspire conference was last week's top highlight. Microsoft 365 brings together Office 365, Windows 10 and Enterprise Mobility + Security to deliver a complete, intelligent and secure solution to empower employees. Three new apps were also announced, providing even more value to Office 365 Business Premium customers. This week's digest also has details on new intelligent search capabilities in OneDrive for Business, updates to the Outlook app on Android and iOS, as well as several new features in Microsoft Flow.

The monthly video update for July from Jim Naroski, a huge collection of free Microsoft eBooks, several Microsoft IT Showcase resources including four Skype for Business studies and a video on managing and governing SharePoint in Office 365 wrap up this week's digest.

 


UPCOMING EVENTS

 

Azure Active Directory Webinars for July

When: Multiple sessions currently scheduled from July 11 – July 19, 2017 | Sessions include Azure AD Connect Health, Getting Ready for Azure AD, Securing Your Identities with Multi-Factor Authentication (MFA), Accessing Your Organization’s Internal Applications via Azure AD App Proxy and more. Each 1-hour or 75-minute webinar is designed to support IT Pros in quickly rolling out Azure Active Directory features to their organization. All webinars are free of cost and will include an anonymous Q&A session with our Engineering Team. So, come with your questions!  Capacity is limited. Sign up for one or all of the sessions today!  Note: There are also some sessions available on-demand.

 

SharePoint: Inform and engage your employees

When: Wednesday, July 19, 2017 | On-demand webcast—Farren Roper and Mark Kashman, from the SharePoint team, present the “SharePoint: Inform and engage your employees” business webcast. Sign up today to receive details via email. And in advance, read the new, related eBook, “4 secrets to a connected workplace.”

 

Free Microsoft Innovative Educator (MIE) Teacher Academy

When: July 21, 2017 through September 29, 2017 at various US locations | Microsoft Innovative Educator (MIE) Trainers are leading fun, professional development sessions this summer and they’re coming to a city near you. Join us for these BYOD workshops showcasing Microsoft’s hottest tools and resources for K-12 teachers, built to empower students to achieve more. Attendees will get to explore tools such as Microsoft Teams, Office Online, OneNote Class Notebooks, Microsoft Forms and Sway, and learn how technology can provide their students with learning experiences beyond the four walls of their classroom, thanks to Skype in the Classroom and the Microsoft Educator Community. Register today and join us at an upcoming Teacher Academy near you!

 


BLOG ROUNDUP

 

Introducing Microsoft 365

Last week at Inspire, Satya Nadella unveiled Microsoft 365, which brings together Office 365, Windows 10 and Enterprise Mobility + Security, delivering a complete, intelligent and secure solution to empower employees. It represents a fundamental shift in how we will design, build and go to market to address our customers’ needs for a modern workplace. To address the commercial needs from the largest enterprise to the smallest business, we are introducing Microsoft 365 Enterprise and Microsoft 365 Business. Microsoft 365 Enterprise is designed for large organizations and integrates Office 365 Enterprise, Windows 10 Enterprise and Enterprise Mobility + Security to empower employees to be creative and work together, securely. Microsoft 365 Business is designed for small- to medium-sized businesses with up to 300 users and integrates Office 365 Business Premium with tailored security and management features from Windows 10 and Enterprise Mobility + Security. It offers services to empower employees, safeguard the business and simplify IT management.

 

New business apps in Office 365 Business Premium help you run and grow your small business

To make Office 365 more valuable for your small business, we are announcing three new applications coming to Office 365 Business Premium: (1) Microsoft Connections—A simple-to-use email marketing service, (2) Microsoft Listings—An easy way to publish your business information on top sites, and (3) Microsoft Invoicing—A new way to create professional invoices and get paid fast. We’re also introducing the Office 365 Business center, a central place where you can manage these business apps and get an end-to-end view of your business. In addition, we’re adding MileIQ, the leading mileage tracking app, as an Office 365 Business Premium subscription benefit. These new services—along with the recently added Microsoft Bookings and Outlook Customer Manager—help you win customers and manage your business. Connections, Listings, Invoicing and the Business center are rolling out in preview over the next few weeks to Office 365 Business Premium subscribers in the U.S., U.K. and Canada, starting with those in the First Release program. MileIQ Premium is now available to all Business Premium subscribers in the U.S, U.K. and Canada.

 

Digitize your workplace one photo at a time with intelligent search in OneDrive for Business

A picture is worth a thousand words, and now your images uploaded to OneDrive can be just that with our new intelligent detection and search functionality. Taking notes is much harder than snapping a photo with your phone. It's a great way to document a meeting or site visit; the whiteboard image tells the story of shared construction and development. With that in mind, the product team has been hard at work on a feature which we hope helps you every day to work smarter, faster and with better recollection. We're excited to share some of the steps on the journey with you. We are starting small at first, but have a goal to grow the capability based on your feedback and needs. Releasing this month (July 2017) is the ability to search for photos using the objects that are in them. When you upload an image into OneDrive, whether a snap of a whiteboard, a receipt, a screenshot, a vector graphic, a line drawing or even x-ray film, OneDrive will automatically detect it and make it available in search, without you having to do anything other than upload the image.

 

Redesigned navigation, conversations and search in Outlook for iOS and Android

We’re launching exciting new changes to Outlook on iOS and Android. It’s the Outlook you know and love, with a redesigned conversation experience and the ability to quickly switch between accounts and browse folders. In addition, new intelligent search capabilities, powered by Microsoft Graph, are coming soon. The new accounts navigation and conversations experiences are available now in Outlook for iOS with Outlook.com and Gmail accounts. Support for Office 365 accounts will be rolling out in the coming days. The new search experience is coming soon to Outlook for iOS. We’re working hard right now to bring all three of these new experiences to Outlook for Android over the next couple of months.

 

Easily publish your flows to the public gallery, improved custom connector experience and more!

It's now easy to publish any flows you create to the public gallery. Simply select the More... menu when you are looking at a flow and then choose the Submit to Gallery option. You can choose the set of categories for the template that will be in the gallery. We have also released a new action for the Office 365 Outlook and Outlook.com connectors called Get calendar view of events. This action takes a start time and end time as inputs, and returns a list of the events, including instances of recurring events, that happen during that time window. There is a new Run now option on the mobile app when you select a flow that has Recurrence as the trigger. This means you can quickly kick off flows that normally run on a schedule. There are several new capabilities in the Custom Connector experience. First, you can create dynamic dropdowns, which are used when the set of items in a list can change depending on the user. Second, you can add polling triggers. Third, custom connectors now have a rich experience for testing.

 


NOTEWORTHY

 

Video: Office 365 Update for July 2017

Format: Video (11 minutes) | Jim Naroski covers recent enhancements to Office 365, including the Planner mobile app, Outlook for Android, iOS and Mac, Microsoft Forms, Microsoft Stream, Security and Visio + Power BI. The course transcript, complete with links to additional information on everything covered, is available at http://aka.ms/o365update-transcripts.

 

Largest FREE Microsoft eBook Giveaway including: Office 365, Office 2016, Power BI, and many more!

An annual post with a list of over 350 eBooks covering a wide array of subjects, including Office 365, Office 2016, Power BI, PowerShell and many, many more! Most of the eBooks are available in multiple formats such as PDF, EPUB, DOC and MOBI. Also included is a PowerShell script to download all of the eBooks so you don’t have to do so individually.

 

SharePoint Online: Coming soon - new capabilities from Site Information panel (change site icon and delete site)

We wanted to share a couple of new features that will be enabled on the Site information panel (accessible from the gear menu) that site owners will soon be able to take advantage of: (1) Change the site icon: For group-connected sites this will update the icon for the group and be reflected in other group workloads, and (2) Delete site: This will allow site owners to delete the site. For group-connected sites, this will trigger the deletion of the group as well. Owners will be prompted with a confirmation dialog requiring an explicit acknowledgement via checkbox that content will be deleted before proceeding. We plan to start rolling this feature out in the next couple of weeks, starting with First Release.

 

Microsoft IT Showcase: Sharing content securely - Managing and governing SharePoint and Office 365 at Microsoft

Format: Video (52 minutes) | Published: July 13, 2017 | In business, employee collaboration with colleagues inside and outside of the company is the name of the game. Finding a balance between readily sharing files and protecting corporate assets is crucial. At Microsoft IT, we get it. With over 400,000 SharePoint team sites, intranet portals, and OneDrive for Business sites, we have a lot to manage. Learn how we use custom solutions, organizational policy, Office 365, and Azure Active Directory to help manage, govern, and protect our data in a highly collaborative environment.

 

Microsoft IT Showcase: Cutting costs in the cloud with Skype for Business

Format: Business Case Study | Published: July 14, 2017 | Migrating enterprise unified communications to cloud-based Skype for Business services at Microsoft proved to be an ambitious undertaking for Microsoft IT, but we reaped the rewards. As business groups adopted the services, they significantly reduced operating expenses, capital expenses, legacy third-party carrier fees, and travel. Our journey to the cloud has also led to increased productivity—our people can communicate and collaborate from any location on any device. | Related: How cloud-based PBX and PSTN save Microsoft IT more than $120,000 per day with Skype for Business

 

Microsoft IT Showcase: Deploying Skype for Business in the cloud

Format: Technical White Paper | Published: July 14, 2017 | With the hybrid cloud architecture of Office 365 Enterprise E5 as our foundation, Microsoft IT migrated our global workforce to Skype for Business in two stages. First, we gave some people limited cloud-based conferencing, and then we eventually moved whole user groups to complete cloud-based unified communications. With careful planning, we helped teams maintain productivity during deployment and upgraded our network infrastructure to support capacity requirements. | Related: How Microsoft IT planned and deployed Skype for Business to the Office 365 Enterprise E5 cloud

 

Microsoft PREMCast: Azure Cosmos DB - globally distributed database service with support for multiple data models

Description
Azure Cosmos DB was built from the ground up with the idea of offering global distribution and horizontal scaling as a cloud-based service. Transparent scaling and replication make your data available where your customers are, providing high availability and very low latencies. As the only database today, Cosmos DB supports multiple NoSQL data models, be it key-value, graph, or document data. This PREMCast provides an introduction to this newest database from Microsoft.

Target audience
This PREMCast is aimed at developers, IT architects, and IT managers.

Level 200
(Level scale: 100 = strategic / 200 = technical overview / 300 = in-depth expertise / 400 = technical expert knowledge)

Language
This PREMCast is held in German. Mainly English-language course materials are used.

Registration
To register, please contact your Microsoft Technical Account Manager directly or send an email to peger@microsoft.com. Visit us at Microsoft Premier Education.

First release candidate of SQL Server 2017 now available


We are pleased to announce the first public release candidate for SQL Server 2017, Release Candidate 1 (RC1), which will be made available later today. This means that development work for the new version of SQL Server is complete along most dimensions needed to bring the industry-leading performance and security of SQL Server to Windows, Linux, and Docker containers.

In our seven community technology previews (CTPs) to date, SQL Server 2017 has delivered:

  • Linux support for tier-1, mission-critical workloads – SQL Server 2017 support for Linux includes the same high availability solutions on Linux as Windows Server, including Always On availability groups integrated with Linux native clustering solutions like Pacemaker.
  • Graph data processing in SQL Server – With the graph data features available in SQL Server 2017 and Azure SQL Database, customers can create nodes and edges, and discover complex and many-to-many relationships.
  • Adaptive query processing – Adaptive query processing is a family of features in SQL Server 2017 that automatically keeps database queries running as efficiently as possible without requiring additional tuning from database administrators. In addition to the capability to adjust batch mode memory grants, the feature set includes batch mode adaptive joins and interleaved execution capabilities.
  • Python integration for advanced analytics – Microsoft Machine Learning Services now brings you the ability to run in-database analytics using Python or R in a parallelized and scalable way. The ability to run advanced analytics in your operational store without ETL means faster time to insights for customers while easy deployment and rich extensibility make it fast to get up and running on the right model.
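For instance, the graph features are exposed through ordinary T-SQL: node and edge tables are created with AS NODE / AS EDGE, and relationships are queried with the MATCH predicate. A minimal sketch (the Person and knows tables here are hypothetical examples, not from this announcement):

```sql
-- Hypothetical graph schema: people connected by a "knows" relationship.
CREATE TABLE Person (ID INT PRIMARY KEY, name NVARCHAR(100)) AS NODE;
CREATE TABLE knows AS EDGE;

INSERT INTO Person VALUES (1, 'Alice'), (2, 'Bob');

-- An edge row references the $node_id values of its two endpoints.
INSERT INTO knows ($from_id, $to_id)
VALUES ((SELECT $node_id FROM Person WHERE name = 'Alice'),
        (SELECT $node_id FROM Person WHERE name = 'Bob'));

-- MATCH traverses the graph: who does Alice know?
SELECT p2.name
FROM Person AS p1, knows, Person AS p2
WHERE MATCH(p1-(knows)->p2) AND p1.name = 'Alice';
```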

Key enhancements in Release Candidate 1

In SQL Server 2017 RC1, there were several feature enhancements of note:

  • SQL Server on Linux Active Directory integration – With RC1, SQL Server on Linux supports Active Directory Authentication, which enables domain-joined clients on either Windows or Linux to authenticate to SQL Server using their domain credentials and the Kerberos protocol.
  • Transport Layer Security (TLS) to encrypt data – SQL Server on Linux can use TLS to encrypt data that is transmitted across a network between a client application and an instance of SQL Server. SQL Server on Linux supports the following TLS protocols: TLS 1.2, 1.1, and 1.0.
  • Machine Learning Services enhancements – In RC1, we add more model management capabilities for R Services on Windows Server, including External Library Management. The new release also supports Native Scoring.
  • SQL Server Analysis Services (SSAS) – In addition to the enhancements to SSAS from previous CTPs of SQL Server 2017, RC1 adds additional Dynamic Management Views, enabling dependency analysis and reporting. See the Analysis Services blog for more information.
  • SQL Server Integration Services (SSIS) on Linux – The preview of SQL Server Integration Services on Linux now adds support for any Unicode ODBC driver, if it follows ODBC specifications. (ANSI ODBC driver is not supported.)
  • SQL Server Integration Services (SSIS) on Windows Server – RC1 adds support for SSIS scale out in highly available environments. Customers can now enable Always On for SSIS, setting up Windows Server failover clustering for the scale out master.
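As an illustration of the Active Directory integration, once the Linux host is domain-joined and SQL Server is configured for AD authentication, Windows-based logins are created with the usual T-SQL (the CONTOSO domain, group, and database names below are placeholders):

```sql
-- Create a login for an AD group (placeholder domain/group name).
CREATE LOGIN [CONTOSO\sqlusers] FROM WINDOWS;

-- Map it to a database user as usual (placeholder database name).
USE SomeDatabase;
CREATE USER [CONTOSO\sqlusers] FOR LOGIN [CONTOSO\sqlusers];
```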

SQL Server 2017 for faster performance

SQL Server 2017 has several new benchmarks demonstrating faster performance than competitive databases, and against older versions of SQL Server:

Streamline your DevOps using SQL Server 2017

In SQL Server 2017, we have introduced support for SQL Server on Linux-based containers, a benefit for customers using containers in development or production. We’re also working to help developers get started developing an app for SQL Server as fast as possible with installation instructions, code snippets, and other handy information.

On our new microsite DevOps using SQL Server, which launched today, developers and development managers can learn how to integrate SQL Server in their DevOps tasks. Find demos, documentation, and blogs, as well as videos and conference presentations. Or, join the DevOps conversation at our Gitter channels.

Customers are already benefitting from SQL Server 2017

In fact, with our Early Adoption Program, customers can develop new applications for SQL Server 2017 or add Linux support to existing applications, and get the support and end-user license agreement that they need to go into production on SQL Server right now. Here are some customers already benefitting from SQL Server 2017 on Linux:

  • Convergent Computing – A system integrator and longtime Microsoft partner, Convergent Computing was able to achieve a much faster return on server and storage hardware investments than usual by moving some tier-2 applications to inexpensive, white box servers running SQL Server 2017 on Linux.
  • dv01 – Financial technology startup dv01 started out with an open source database on a competitor cloud. But when it ran into performance and scale problems, SQL Server was able to give it 15X faster performance, plus in-database advanced analytics. And by moving to SQL Server 2017, dv01 could standardize its operating systems on Linux—all with an easy migration.

Get started with SQL Server 2017 RC1 today!

Try the release candidate of SQL Server 2017 today! Get started with our updated developer tutorials that show you how to install and use SQL Server 2017 on macOS, Docker, Windows, and Linux and quickly build an app in a programming language of your choice. For more ways to get started, try the following:

Have questions? Join the discussion of SQL Server 2017 at MSDN. If you run into an issue or would like to make a suggestion, you can let us know through Connect. We look forward to hearing from you!

SQL Server 2017 containers for DevOps scenarios


This post was authored by Tony Petrossian, Partner Group Program Manager, Database Systems Group

SQL Server 2017 will bring with it support for the Linux OS and containers running on Windows, Linux, and macOS. Our goal is to enable SQL Server to run in modern IT infrastructure in any public or private cloud.

With support for containers, SQL Server can now be used in many popular DevOps scenarios.  Developers working with Continuous Integration/Continuous Deployment (CI/CD) pipelines can now include SQL Server 2017 containers as a component of their applications for an integrated build, test, and deploy experience.

CI/CD automation with containers – Using containers greatly simplifies the development, testing, and deployment of applications. This is achieved by the packaging of all dependencies, including SQL Server, into a portable, executable environment that reduces variability and increases the speed of every iteration in the CI/CD pipeline. This also enforces a consistent experience for all participants since they can share the same state of an application in their containers. Developers can improve applications in their local environments during the first part of the Continuous Integration process.

The development process starts by taking a container that represents the current state of a production application, including a subset of the sanitized data. Developers can then add their features and fixes to it, while having the ability to verify the functionality of the application at any moment. Then, the container can be sent to a testing/quality assurance environment where it can be tested with a larger, more representative dataset.

Continuous Deployment is a critical part of DevOps pipelines. With a successful CD pipeline, a validated and self-contained version of the application is available. Developers can publish and share fully configured containers with all application dependencies, including SQL Server, with their peers. This can significantly improve developers’ ability to collaborate as they can all work on the same exact configurations simultaneously without having to build the complicated environment necessary for developing and testing applications with many components.

Parallel testing made fast and easy – Developers can automate the large-scale testing of containerized applications that include SQL Server. Thousands of tests can execute in parallel using high-density container deployments with managed container services. Kubernetes, Docker Swarm, or other orchestration systems can be used to easily manage a large number of test executions. Long-running test cycles can be optimized by load balancing the executions across multiple pods that spin up on demand and spin down when finished.

The Microsoft SQL Server development team is now taking advantage of these capabilities in building, testing, and publishing the new versions of SQL Server 2017. The team uses Azure Container Services to deploy hundreds of containers managed by a large Kubernetes cluster to execute all daily tests of SQL Server. Hundreds of thousands of tests are executed within hours of the availability of a new build! This methodology has enabled the team to run more tests in less time with fewer resources.

Multi-OS development, test, and production environments – With the containerization of the app, developers no longer need to be concerned about aligning stages of the development and production pipeline with the same exact distribution and version of Linux. Developers can containerize their application environment, including SQL Server, to abstract it from the operating system of the underlying host. Whether part of the pipeline is operated on Ubuntu and other parts in Red Hat Enterprise Linux (RHEL), the ability to containerize the entire application environment eliminates the need to overcome challenges of a cross-platform environment. Developers are also free to choose their preferred development environment without worrying about compatibility issues in later parts of the pipeline. With SQL Server 2017, developers can run SQL Server Linux Docker images on macOS, Windows, and Linux.

Deploying SQL Server into production – As new versions of the application are tested and verified, builds of the containers are published for use in staging and production. The exact version of the SQL Server that was used throughout the development and testing pipeline is now in the production image, and the team can be confident that the entire stack, including SQL Server, has been tested as one unit and is ready for use.

Learn more about how to better integrate your SQL Server data/database on the DevOps cycle.

AutoSPInstaller Has Been Migrated to GitHub!


With the announcement a few months ago that CodePlex, the home of AutoSPInstaller and plenty of other open-source projects, was going to shut down roughly by the end of 2017, I knew I'd have to take action to keep my popular AutoSPInstaller automated SharePoint 201x installation project accessible. So, after a successful migration attempt for my other popular (but much smaller) CodePlex project AutoSPSourceBuilder, I decided to give AutoSPInstaller migration a try. Luckily, the migration process has gotten much smoother, and happened much more quickly and with less effort than I'd anticipated. Mind you, only the source code and version history have been migrated - no Discussions, Issues or Releases will be moved over - so we'll be starting from scratch in a sense for those.

Anyhow without further delay, here's the location of the new home of AutoSPInstaller:

https://github.com/brianlala/AutoSPInstaller

According to Brian Harry's blog post, the other items on the old CodePlex project (Discussions, Items etc.) should remain available until December 15th 2017 for reference & support purposes before the content gets archived.

Finally, I'd be remiss not to mention AutoSPInstaller's successor in this space, SharePointDSC - check it out!

Cheers
Brian

 

 


SharePoint 2013 & 2016 - Manager and Assistant values swapped in User Profiles

Here’s one that was a problem in SharePoint 2013, was fixed, but the fix was never ported to SharePoint 2016, so we had to fix it again.

Consider the following scenario:

You are importing user profiles from Active Directory (AD).  This can happen using any of the profile import methods for either SharePoint 2013 or 2016.
  • 2013:
    • SharePoint Profile Synchronization (FIM Sync)
    • SharePoint Active Directory Import (AD Import)
  • 2016:
    • SharePoint Active Directory Import (AD Import)
    • External Identity Manager (MIM Sync)
You create profile property mappings to import both Manager and Assistant.
You run the Sync / Import and afterwards you find that some users show the incorrect values for manager and assistant.  “Incorrect” in this case can mean any of the following:
  • The values for Manager and Assistant are swapped.
  • Manager is blank and Assistant contains the manager's name.
  • Assistant is blank and Manager contains the assistant's name.
For example, in AD, user “Josh” has these values:
Manager = Jim
Assistant = Adam
After the Sync, Josh may view his profile and find that it says his manager is Adam and his assistant is Jim.

Cause:

In both SharePoint 2013 and 2016, this issue has the same root cause, which was a problem with one of the SQL stored procedures used to create the relationships between user, manager and assistant.
For SharePoint 2013, it was fixed in build 15.0.4659.1000 = ServicePack1 + October of 2014 cumulative update (CU).
For SharePoint 2016, this was fixed in build 16.0.4561.1000 = July 2017 public update (PU).

Resolution:

Upgrade your farm.
For SharePoint 2013, you can install any build at or newer than October 2014 CU.  However, since patching a SharePoint farm is not a trivial matter, I would recommend going much newer, like July of 2017 CU:  https://support.microsoft.com/en-us/help/3213569/
For SharePoint 2016, this has been fixed in what is currently the latest build, so that’s your only option:
https://support.microsoft.com/en-us/help/3213544 
https://support.microsoft.com/en-us/help/3213543

More Info:

This issue is called out in the public KB article for the July 2017 update for SharePoint 2016:
https://support.microsoft.com/en-us/help/3213544/
After you use the SharePoint Active Directory import option (AD import) for a User Profile service application, the Assistant property's value is incorrectly set to the Manager property's value.
However, it doesn’t specify that it can also happen when using an external identity manager (like MIM 2016), and it only mentions one of the possible behaviors.
I have tested this for both AD Import and MIM Sync, and have experienced all of the behaviors I described above.  Which one you hit just depends on which value (manager or assistant) is processed first.
Note: In SharePoint 2016, when using an external identity manager (like MIM 2016), the manager, assistant, and group membership references are processed after the sync / import by a timer job:
"<UPAName> - Updates Profile Memberships and Relationships Job"
(internal name: ExternalIdentityManagerMembershipsAndRelationshipsJob)
After you run an import / sync, you won't see the manager or assistant values updated until that timer job runs.
You can use a SQL query like this against your Profile database to quickly see your Manager and Assistant values and see if they line up:
select upf.recordid, upf.ntname, upf.preferredName, pl.propertyname, upv.PropertyID, upv.PropertyVal, upv.SecondaryVal, upv.text
from upa.UserProfile_Full upf (nolock)
join upa.UserProfileValue upv (nolock) on upf.RecordID = upv.RecordID
join upa.PropertyList pl (nolock) on pl.PropertyID = upv.PropertyID
where upv.PropertyID in (6, 21)
order by upf.ntname

Some more Keywords for Bing:
Manager assistant swap switch substitution
Forefront Identity Manager FIM
Microsoft Identity Manager MIM
User Profile Service Application UPSA UPA

Free training program for teachers!

How do you get started with the many exciting digital tools offered in Office 365? It isn't easy, and we know that many schools and educational institutions around the country are struggling to get the digital wheels turning.

I work as a Learning Consultant, and my job is to help and advise schools at all levels on digitally supported teaching. I have created this guide, which describes our program - and the tools we would like to help you get started with.

This is a collaboration in which both the school and Microsoft align on how it is best facilitated. We will do our best to get you off to a good start with our tools.

Read the guide below or via this link. If you view the guide below, it is recommended to make the Sway full screen via the expand button on the right side.

 

 

 

“Discovered apps” node in Microsoft Intune on Azure console


By Matt Shadbolt | Senior Service Engineer | https://blogs.technet.microsoft.com/ConfigMgrDogs/

In the new Microsoft Intune on Azure administration console, there is a new “Discovered apps” node available for each MDM enrolled device.

clip_image002

There’s been some recent confusion around what we should expect to see in here.

The Discovered apps node is a direct reflection of the device's discovered apps at the last hardware inventory time.

For devices with Device Ownership marked as Corporate this will be all apps installed on the device. For devices with Device Ownership marked as Personal this will be all apps installed via the Intune Company Portal or apps installed in a Required deployment.

The list of apps displayed here is only reflective of the apps installed at the last inventory scan. Please be aware that inventory runs every 7 days for mobile devices, so the Discovered apps list could potentially be up to seven days out of date.


ConfigMgr 1702: Adding a new node (Secondary Replica) to an existing SQL AO AG


Scenario:

We already have a working primary and secondary replica, and we know that ConfigMgr 1702 supports an extra replica, i.e. a second secondary replica. So we are adding a freshly built node as that second secondary replica.

 

The documentation we have around this can be found below.

https://docs.microsoft.com/en-us/sccm/core/servers/deploy/configure/configure-aoag#add-and-remove-replica-members

To add a new replica member

  1. Add the new server as a secondary replica to the availability group. See Add a Secondary Replica to an Availability Group (SQL Server) in the SQL Server documentation library.
  2. Stop the Configuration Manager site by running Preinst.exe /stopsite. See Hierarchy Maintenance Tool.
  3. Use SQL Server to create a backup of the site database from the primary replica, and then restore that backup to the new secondary replica server. See Create a Full Database Backup and Restore a Database Backup using SSMS in the SQL Server documentation.
  4. Configure each secondary replica. Perform the following actions for each secondary replica in the availability group:
     1. Ensure the computer account of the site server is a member of the Local Administrators group on each computer that is a member of the availability group.
     2. Run the verification script from the prerequisites to confirm that the site database on each replica is correctly configured.
     3. If it’s necessary to configure the new replica, manually fail over the primary replica to the new secondary replica and then make the required settings. See Perform a Planned Manual Failover of an Availability Group in the SQL Server documentation.
  5. Restart the site by starting the Site Component Manager (sitecomp) and SMS_Executive services.

 

Issues with the above approach:

 

Now what the steps above do not consider is that several things critical to us are not synced when a new replica is set up. Availability group replication does not sync any instance/server-level objects; it only syncs database-level objects.

So what do we miss synching?

 

  1. ConfigMgrEndpoint (This is the SSB endpoint and is not synced, as it is a server-level object.)
  2. ConfigMgr SSB certificates (Same case as the above.)
  3. ConfigMgr Broker Logins (The users scoped at the DB level are synced, but the logins are not, as they are server-level.)
  4. ConfigMgr SQL Identification Cert (This is used to authenticate the site server while connecting to the SQL database. We don’t have to create this manually, as SiteComp has a check to create it, but it does require manual intervention and restarting SiteComp twice.)

 

Working through this issue, we arrived at the things to do for such an addition. After some good back-and-forth troubleshooting, Sean Mahoney helped get this checklist compiled. Below is the scenario for adding a new node for a primary site. It is highly recommended to open a CSS case so that we can help you perform these steps, as they are dynamic and depend on where the steps are performed.

 

1.       Validate that Site server is a Local Admin on SQL Server

2.       Validate there is a SPN for new SQL Node

3.       Validate SQL Aliases on SQL Server

4.       Validate SQL Aliases on Site Server

5.       Add New node to Windows Failover Cluster

6.       Enable Always On to SQL Service on new Replica node and restart SQL Service

7.       Backup SSB Cert on CAS

 

USE MASTER

Backup Certificate ConfigMgrEndpointCert TO FILE='C:\Temp\CAS.CER'

 

8.       Copy Certificate to Primary

9.       Add Site Server as New Replica DB:

   

                CREATE LOGIN [DOMAIN\SITESERVER$] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]

ALTER SERVER ROLE [sysadmin] ADD MEMBER [DOMAIN\SITESERVER$]

ALTER SERVER ROLE [securityadmin] ADD MEMBER [DOMAIN\SITESERVER$]

 

10.   Stop Transaction Log Backup

11.   Add New SQL Replica to AO AG

12.   Stop CM Site

13.   Failover to New Replica and run script:

   

                DECLARE @DBNAME NVARCHAR(128)

SELECT @DBNAME = 'CM_<Site>' -- DBName

 

EXECUTE ('

USE ' + @DBNAME + '

 

ALTER DATABASE ' + @DBNAME + ' SET HONOR_BROKER_PRIORITY ON

ALTER DATABASE ' + @DBNAME + ' SET TRUSTWORTHY ON

 

EXEC sp_configure ''show advanced options'', 1;

RECONFIGURE;

 

EXEC sp_configure ''clr enabled'', 1;

RECONFIGURE;

 

EXEC sp_configure ''max text repl size (B)'', 2147483647;

RECONFIGURE;

 

EXEC sp_changedbowner ''sa'' ;

')
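After the script completes, the database options it sets can be sanity-checked from sys.databases. A small verification sketch (replace CM_&lt;Site&gt; with your actual database name):

```sql
-- Confirm the options set by the script above took effect.
SELECT name,
       is_trustworthy_on,
       is_honor_broker_priority_on,
       is_broker_enabled
FROM sys.databases
WHERE name = 'CM_<Site>';
```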

 

 

14.   Start the Transaction Log backup job

15.   Fail back to the original Replica

16.   Start services (SiteComp may need to be restarted twice to get the SQL Certificates created)

17.   Validate the ConfigMgr SQL Server Identification Certificate is in the Personal store of the new Replica SQL Server

18.   Validate the ConfigMgr SQL Server Identification Certificate is in the Trusted People certificate store on the Site Server

19.   Manually add the Certificate to the SQL Server protocol using SQL Server Configuration Manager and restart the SQL Service on the new Replica

20.   Fail over to the new Replica (this adds the SSB Certificate to the CM Database)

21.   Add the SQL Broker Endpoint

       

 

declare @XMLParam XML;

select @XMLParam= Body from XMLConfigStore where name = 'ServiceBrokerConfiguration'

exec spConfigureServiceBroker @XMLConfig = @XMLParam, @SSBPort = 4022, @SqlCertFile = 'd:\CAS.cer', @ParentSiteCode = '<CASSiteCode>', @ParentSiteSqlServerFqdn = '<CAS SQL Server FQDN>'
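Once spConfigureServiceBroker has run, the endpoint it creates can be verified from the catalog views. A hedged verification sketch (the port should match the @SSBPort value passed above, 4022 here):

```sql
-- Verify the SSB endpoint exists, is started, and listens on the expected port.
SELECT e.name, e.state_desc, te.port
FROM sys.service_broker_endpoints AS e
JOIN sys.tcp_endpoints AS te ON te.endpoint_id = e.endpoint_id
WHERE e.name = 'ConfigMgrEndpoint';
```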

 

 

 

22.   Export SSB Certificate from Primary

 

 USE MASTER

Backup Certificate ConfigMgrEndpointCert TO FILE='C:\Temp\<PRISiteCode>.CER'

 

23.   Copy Cert to CAS SQL Server

 

Assuming CAS is also running SQL AO AG with two nodes.

 

24.   Import New Primary Site SSB Certificate to CAS Node 1

 

                Exec dbo.spCreateSSBLogin

@EndPointLogin='ConfigMgrEndpointLogin<PRISiteCode>',

@DestSiteCode='<PRISiteCode>',

@DestSiteCertFile='C:\<PRISiteCode>.cer',

@EndpointName='ConfigMgrEndpoint',

@DestSqlServerFqdn='<PRISQLListenerFQDN>'

 

25.   Fail CAS over to Node 2

26.   Import New Primary Site SSB Certificate to CAS Node 2

 

Exec dbo.spCreateSSBLogin

@EndPointLogin='ConfigMgrEndpointLogin<PRISiteCode>',

@DestSiteCode='<PRISiteCode>',

@DestSiteCertFile='C:\<PRISiteCode>.cer',

@EndpointName='ConfigMgrEndpoint',

@DestSqlServerFqdn='<PRISQLListenerFQDN>'

 

Repeat steps for third node if needed.

 

Now, if the node-addition scenario happens to be at the CAS site, then the SSB certificates from all primary sites will need to be re-imported on the new node.

 

We are working on changing this behavior to a more automated approach in ConfigMgr 1710. Hope it helps!

 

Sean Mahoney | Sr. PFE, Microsoft
Umair Khan | SEE, Microsoft

 

Disclaimer: This posting is provided “AS IS” with no warranties and confers no rights.

OMS Upgrade Readiness with Configuration Manager


Introduction

Upgrade Readiness is offered as a solution in the Microsoft Operations Management Suite (OMS), a collection of cloud based services for managing on premise and cloud computing environments. If you’re already using OMS, you’ll find Upgrade Readiness in the Solutions Gallery. Click the Upgrade Readiness tile in the gallery and then click Add on the solution’s details page. Upgrade Readiness is now visible in your workspace. If you are not using OMS, go to the Upgrade Readiness page on Microsoft.com and select Sign up to kick off the OMS onboarding process. During the onboarding process, you’ll create an OMS workspace and add the Upgrade Readiness solution to it.

You can use Upgrade Readiness to prioritize and work through application and driver issues, assign and track issue resolution status, and identify computers that are ready to upgrade. Upgrade Readiness enables you to deploy Windows with confidence, knowing that you’ve addressed potential blocking issues.

  • Based on telemetry data from user computers, Upgrade Readiness identifies application and driver compatibility issues that may block Windows upgrades, allowing you to make data-driven decisions about your organization’s upgrade readiness.
  • Information is refreshed daily so you can monitor upgrade progress. Any changes your team makes, such as assigning application importance and marking applications as ready to upgrade, are reflected 24 hours after you make them.

 

 

Prerequisites

  • In order to add the connection, your Configuration Manager environment must first have a service connection point configured in online mode. When you add the connection to your environment, it will also install the Microsoft Monitoring Agent on the machine running this site system role.
  • Register Configuration Manager as a “Web Application and/or Web API” management tool, and get the client ID from this registration.
  • Create a client key for the registered management tool in Azure Active Directory.
  • In the Azure Management Portal, provide the registered web app with permission to access OMS, as described in Provide Configuration Manager with permissions to OMS.

 

 

Environment

Demo System Roles       Names   OS                                                Network NICs      Required KBs
Domain Controller       DC1     Windows Server 2016                               Local             -
Configuration Manager   CM1     Windows Server 2016, Configuration Manager 1702   Local, Internet   -
Windows 7 SP1           Win7    Windows 7                                         Local, Internet   KB 2952664, KB 3150513
Windows 8.1             Win8    Windows 8.1                                       Local, Internet   KB 2976978, KB 3150513

 

 

Integrate Upgrade Readiness with Configuration Manager

 

Create the connection

  1. In the Configuration Manager console, choose Administration > Cloud Services > Upgrade Readiness Connector > Create Connection to Upgrade Analytics to start the Add Upgrade Analytics Connection Wizard.

 

  2. On the Azure Active Directory screen, provide the Tenant, Client ID, and Client secret key, then select Next.

 

  3. On the Upgrade Readiness screen, provide your connection settings by filling in your Azure subscription, Azure resource group, and Operations Management Suite workspace.
  4. Verify your connection settings on the Summary screen, then select Next.

Note: You must connect Upgrade Readiness to the top-tier site in your hierarchy. If you connect Upgrade Readiness to a standalone primary site and then add a central administration site to your environment, you must delete and recreate the OMS connection within the new hierarchy.

Complete Upgrade Readiness tasks

After you've created the connection in Configuration Manager, perform these tasks, as described in Get started with Upgrade Readiness.

  1. Add the Upgrade Readiness solution to the OMS workspace.
  2. Generate a commercial ID.
  3. Subscribe to Upgrade Readiness.

 

 

Configure and deploy Upgrade Readiness deployment script.

 

  1. Download the Upgrade Readiness deployment script from the location below:

https://www.microsoft.com/en-us/download/details.aspx?id=53327

 

  2. Extract the files and modify the 'RunConfig.bat' file

Modify values as below:

set logPath=\\cm1\UpgradeReadinessLogs

set commercialIDValue=0144xxx-xxxx-xxxx-a321-xxxxxx7xxx

set logMode=1

 

 

 

Note: By default, proxy settings are set to ClientProxy=Direct. If client machines have internet access through a proxy server, then two extra values need to be updated in the 'RunConfig.bat' file, and the DisableEnterpriseAuthProxy registry value must be set to 0 on the client computers (under HKLM\SOFTWARE\Policies\Microsoft\Windows\DataCollection).

  3. Save the file at a location accessible when creating the package in Configuration Manager

 

Create a Configuration Manager package/program to deploy the script to client machines

It makes more sense to deploy the Upgrade Readiness script to client machines through Configuration Manager when the number of clients is large.

 

  1. Create the Upgrade Readiness package and program to deploy the script

  2. Set the command line to 'RunConfig.bat'

  3. Set the requirements for preferred platforms, setting them to Windows 7 and Windows 8.1 64-bit

  4. Update the distribution point
  5. Create a test collection and move client machines to the new collection
  6. Deploy 'UpgradeReadinessScript' to the 'Windows Upgrade Readiness' collection
  7. Refresh policies on the client machines
  8. Once the clients get the policies from the Management Point, the script will be installed

  9. On the Windows 7 machine

  10. Check the log file for successful script completion on the CM1 server

Upgrade Readiness view on Azure

It is important to disable the Upgrade Readiness demo data in OMS before we can see the actual data coming in. To disable demo mode, go to the ‘Upgrade Readiness’ settings and select the ‘Disable’ switch for Demo Mode.

It could take up to 48-72 hours to start seeing the data online.

‘Upgrade Readiness’ portal view once data starts populating

Configuration Manager shows the ‘Upgrade Analytics’ data populated through the connector under Monitoring | ‘Upgrade Analytics’. This data can also take 48-72 hours to start populating once the data is visible in OMS.

 

 

Troubleshooting Upgrade Readiness

Upgrade Readiness deployment script

https://docs.microsoft.com/en-us/windows/deployment/upgrade/troubleshoot-upgrade-readiness
