Channel: TechNet Blogs

Self-Service Power BI in Excel: Power Pivot


Power BI for Office 365

In the previous article we introduced Power Query. Today, continuing with the example from that article, we move on to the second topic: Power Pivot.

With Power Pivot, you can build your own data model from a variety of data sources, configure the model and its structure exactly as you need, and refresh the data from the original sources as often as the situation requires. Now we load the data into Excel's excellent self-service data modeling feature: Power Pivot. Power Pivot lets you create and manage a collection of tables and relationships directly in Excel.

Chapter 1: Loading Data

We have loaded the appended S&P 500 NYSE and NASDAQ daily data into the data model, but we also want to load the base S&P 500 table. This is very easy to do: select the Excel worksheet tab that contains the S&P 500 list, then select Add to Data Model from the Power Pivot tab on the toolbar.

When a table is added to the data model, Power Pivot opens in a separate window and displays the table that was added to the model. We can also open the Power Pivot window by selecting the Manage icon on the Power Pivot toolbar.

In Power Pivot, the tables in the data model appear as tabs, much like tables appear in Excel. We start by renaming the table we just added to SP 500 (double-click the tab and type the new name).

Chapter 2: Refining Your Data

While browsing the data in the model, we noticed that an important perspective was missing: annual performance. We can pull out the first day's data and the last day's data (for example, the price) and compare the difference directly. This is simple and fast.

We had already created a query that merges and appends all the daily data, named NYSE – NASDAQ – SP500 Append. Now we want a trimmed-down version of that same query, reduced to only the first and last trading days. To do this, we first return to Excel to find the query (switching between Power Pivot and Excel is easy: just select the Excel window. There is no need to close the Power Pivot window).

Back in Excel, we reopen the Workbook Queries pane (select Power Query > Manage Queries > Workbook to reopen the pane). We noticed that selecting the Workbook ribbon button again closes the Workbook Queries pane; since we want the pane to stay open, we click the Workbook button once more.

We find the query in the Workbook Queries pane, browse to the Query tab under Table Tools, and select Duplicate. This lets us start from the query we built earlier, which appends the merged NYSE and NASDAQ data, and then filter it down to only the first and last trading days of the year.

The duplicated query appears at the bottom of the Workbook Queries pane, named NYSE – NASDAQ – SP500 Append (2). Hovering over the query brings up a fly-out window with a preview of the data. We selected Edit Query at the bottom of the fly-out.

The Query window appears, where we can restrict the NewColumn.date field to include only the first trading day. When we browsed to the NewColumn.date field, we found that its data type was not set to Date.

This is easy to fix in the Query Editor: select the NewColumn.date column, then choose Date from the Data Type drop-down list in the Transform section of the ribbon.

With the data type set to Date, we can filter that column so the data includes only the first trading day of the year.

Next we simply create a query to get the first day's data, duplicate it to get the last day's data, and then merge the data from both queries into a single table so we can perform the calculations we want (the first and last trading days' data need to be in a single record, or row, in the model). Let's walk through these steps so you can follow along in your own Excel workbook.
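For readers who like to see the shape of this transformation as code, here is a minimal Python sketch (the tickers, dates, and prices below are illustrative, not taken from the workbook) that filters a set of daily rows down to the first and last trading days and merges them into one record per ticker, mirroring the Power Query steps described above:

```python
from datetime import date

# Illustrative daily rows; the real workbook has one row per ticker per trading day.
daily = [
    {"ticker": "AAA", "date": date(2009, 1, 2), "close": 10.0},
    {"ticker": "AAA", "date": date(2009, 6, 15), "close": 12.5},
    {"ticker": "AAA", "date": date(2009, 12, 31), "close": 14.0},
    {"ticker": "BBB", "date": date(2009, 1, 2), "close": 50.0},
    {"ticker": "BBB", "date": date(2009, 12, 31), "close": 45.0},
]

first_day = min(r["date"] for r in daily)   # 1/2/2009
last_day = max(r["date"] for r in daily)    # 12/31/2009

# The "First Trading Day" and "Last Trading Day" queries: filter to a single date each.
first = {r["ticker"]: r for r in daily if r["date"] == first_day}
last = {r["ticker"]: r for r in daily if r["date"] == last_day}

# Merge on ticker so both closing prices land in a single record per ticker.
annual = [
    {"ticker": t,
     "first_day_close": first[t]["close"],
     "last_day_close": last[t]["close"]}
    for t in sorted(first.keys() & last.keys())
]
```

In the workbook the same merge is done on the ticker symbol column, with the two filtered queries playing the roles of `first` and `last`.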

1. In the NewColumn.date column, select only the first trading day (1/2/2009), then click OK.

2. In the Query Editor, remove the NewColumn. prefix from each column; this makes the fields easier to read. Right-click a column, then choose Rename... from the menu.

3. Remove the following columns (we do not need them for this example): SEC filings, Date first added, stock_symbol (a duplicate column), stock_gain_loss_dollar, stock_gain_loss_percent, dollar_volume, and stock_volume.

4. Rename the query First Trading Day and clear the Load to worksheet check box.

5. Select Apply & Close on the ribbon. The query appears in the Workbook Queries pane.

6. Right-click the First Trading Day query in the Workbook Queries pane, then select Duplicate. The duplicated query is added to the Workbook Queries pane, named First Trading Day (2). You may need to scroll down in the pane to see it.

7. Right-click First Trading Day (2), then select Edit Query from the menu.

The next few steps are trickier, but they showcase the flexibility and power of Power Query. They also illustrate one of Power Query's great features: how it handles the shaping and filtering steps of a query.

When we selected Edit Query on First Trading Day (2), the following screen appeared. Note the date field, and the Applied Steps section of the Query Settings pane, where the last item (RemovedColumns) is selected.

When one of these steps is selected, the data in the query automatically reverts to its state at that point in the shaping process (with later filters not applied). For example, selecting Source (the first step in Applied Steps) reverts the data to its state when that step was applied to the query.

You will notice that the column names revert, with the NewColumn. prefix reappearing. You will also see that the date field reverts to the General data type and (though it is harder to spot) that no filter is applied to the date field, so every trading day appears in the dataset.

When we select the fourth step in Applied Steps (the RenamedColumns step), the data appears as it did at that point in the shaping process. At that point, only the removal of unused columns remains.

We want to change this query so that it filters for the last trading day of the year instead of the first. Picking up where we left off, we take the following steps:

8. Click the gear icon to the right of the FilteredRows entry in the Applied Steps section of the Query Settings pane. The following window appears.

9. Select the drop-down arrow in the field showing 1/2/2009, then select the last entry, 12/31/2009. Select OK.

10. Select the last item in Applied Steps (the RemovedColumns step).

11. Rename the query Last Trading Day.

12. Select Apply & Close.

Next, we merge the two queries. Here are the steps we will take:

1. Select Merge from the Power Query tab on the ribbon.

2. The Merge window appears. Specify the First Trading Day and Last Trading Day queries as the tables to merge, select Ticker symbol as the matching column, then select OK.

3. As before, the columns from the merged query appear as a table, in a column named NewColumn at the end of the query. Click the double-arrow icon to expand that table into individual columns.

4. Rename the query SP500 Annual Data.

5. Select the Load to Data Model check box, then select Apply & Close.

When this is done, our data model has a new table containing the closing-price data for the first and last days. The table includes some columns we do not need and some we would like to rename. No problem: all of this can be done in Power Pivot.

We make the following changes in Power Pivot:

1. Rename the date column to First Day Date, and the NewColumn.date column to Last Day Date.

2. Rename stock_price_close to First Day Close, and NewColumn.stock_price_close to Last Day Close.

3. Remove GICS from the GICS Sector and GICS Sub-Industry column names.

4. Delete the following duplicate columns (this can also be done in Power Query): Key, NewColumn.Ticker symbol, NewColumn.Company, NewColumn.GICS Sector, NewColumn.GICS Sub-Industry, NewColumn.Address of Headquarters, NewColumn.Key, and NewColumn.Trading.

Renaming these columns will help later when we build reports from the data.

Chapter 3: Creating Calculations

Now that we have the annual data in our data model, we can calculate the overall gain for each stock (in both dollars and percent). In Power Pivot, you can create new columns in any table and use formulas in those columns to perform many different functions. These calculations are often called calculated fields.

In Power Pivot, calculated fields use Data Analysis Expressions (DAX) formulas. DAX formulas are very similar to formulas in Excel and work in both Power Pivot and PivotTables. If you know how to create formulas in Excel, DAX will pose no difficulty.

The first formula we create calculates the annual gain or loss in dollars. Select a cell in Add Column, then enter the formula in the DAX formula bar. As you type, Power Pivot makes suggestions based on the tables and fields available in the data model.

When the formula is complete, the result is calculated for every row in the table. We added a few more columns, including the percentage gain for the year, and applied appropriate data types and formats to each. Here are the calculations we created (formulas shown in parentheses):

  • Gain or Loss - dollars (=[Last Day Close]-[First Day Close])
  • Gain or Loss - percent (=([Last Day Close]-[First Day Close])/[First Day Close])
  • Rank (=RANKX('SP500 Annual Data',[Gain or Loss - dollars],))
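As a rough cross-check of what these DAX columns compute, here is a hedged Python sketch of the same row-level arithmetic, including a simple rank by dollar gain (the sample values are made up):

```python
# Each dict stands in for a row of the SP500 Annual Data table.
rows = [
    {"ticker": "AAA", "first_day_close": 10.0, "last_day_close": 14.0},
    {"ticker": "BBB", "first_day_close": 50.0, "last_day_close": 45.0},
]

for r in rows:
    # Gain or Loss - dollars: =[Last Day Close]-[First Day Close]
    r["gain_dollars"] = r["last_day_close"] - r["first_day_close"]
    # Gain or Loss - percent: =([Last Day Close]-[First Day Close])/[First Day Close]
    r["gain_percent"] = r["gain_dollars"] / r["first_day_close"]

# Rank: like RANKX, rank rows by dollar gain, with the largest gain ranked 1.
ordered = sorted(rows, key=lambda r: r["gain_dollars"], reverse=True)
for i, r in enumerate(ordered, start=1):
    r["rank"] = i
```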

Beyond new columns, if you want calculated fields that provide aggregate totals and similar results, you can create them by selecting a cell in the Power Pivot calculation area, the grid of cells between the table data and the tabs. A calculated field can be placed in any cell of the calculation area.

We created the following calculated fields. If you are following along in your own workbook and using the same column names, you can copy each of the following directly into a cell in the calculation area:

  • Sum of Gain or Loss - dollars: =SUM([Gain or Loss - dollars])
  • Average gain - dollars: =AVERAGE([Gain or Loss - dollars])
  • Average gain - percent: =AVERAGE([Gain or Loss - percent])
  • Best performing stock - dollars: =MAX([Gain or Loss - dollars])
  • Worst performing stock - dollars: =MIN([Gain or Loss - dollars])
  • Best performing stock - percent: =MAX([Gain or Loss - percent])
  • Worst performing stock - percent: =MIN([Gain or Loss - percent])
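These aggregate measures are plain reductions over the gain columns; a minimal Python equivalent (with made-up gain figures) looks like this:

```python
# Made-up per-stock gain figures standing in for the two calculated columns.
gains_dollars = [4.0, -5.0, 12.0]
gains_percent = [0.40, -0.10, 0.25]

measures = {
    "sum_gain_dollars": sum(gains_dollars),                       # =SUM(...)
    "avg_gain_dollars": sum(gains_dollars) / len(gains_dollars),  # =AVERAGE(...)
    "avg_gain_percent": sum(gains_percent) / len(gains_percent),
    "best_dollars": max(gains_dollars),                           # =MAX(...)
    "worst_dollars": min(gains_dollars),                          # =MIN(...)
    "best_percent": max(gains_percent),
    "worst_percent": min(gains_percent),
}
```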

Chapter 4: Creating Relationships

In Power Pivot you can also define relationships between tables. A relationship creates a connection between tables, based on columns in each table that contain similar or identical data. Relationships let you create reports that include data from related tables.

We create relationships between the SP 500 table and the other two tables in the data model, based on the ticker symbol. This can be done with a simple drag-and-drop in Power Pivot's Diagram View. In Power Pivot, select Diagram View from the View section of the Home ribbon.

To create a relationship, drag a field from the primary table onto the corresponding field in the table to be related. A line appears between the two tables, representing the relationship; selecting the line highlights the related fields.

We dragged the Ticker symbol field from the SP500 Annual Data table onto the Ticker symbol field in the SP500 table to create a relationship. Looking at Diagram View, we noticed that the other table had a rather long name, NYSE___NASDAQ___SP500_Append, so we decided to rename it, which can be done directly in Diagram View (double-click the table name and type the new name: SP500 Daily Data).

We dragged the Ticker symbol field from the SP500 Daily Data table onto the Ticker symbol field in the SP 500 table to create another relationship. Selecting the relationship line between the tables highlights the relationship. Our Power Pivot Diagram View now looks like the screen below.

Chapter 5: Creating Hierarchies

In Power Pivot, a hierarchy is a group of data elements that share a logical parent-child relationship. For example, a geographic hierarchy might consist of State, County, and City: State sits above County (and contains multiple Counties), and County sits above City (and contains multiple Cities).

Using hierarchies in Power Pivot lets you create reports that drill down into the data. We decided that a sector and sub-industry hierarchy would be worth exploring, especially if we could build reports that drill into the data for a particular sector. There are many ways to create a hierarchy in Power Pivot; we decided to create ours in Diagram View.

To create a hierarchy in the SP500 Annual Data table, right-click the table in Diagram View; a menu appears with the option to create a hierarchy.

At this point we noticed that the Sector and Sub-Industry fields in the SP 500 table were still prefixed with GICS. Although we were creating the hierarchy in the SP500 Annual Data table rather than the SP 500 table, we still wanted to fix this. On closer inspection, fields in the SP500 Daily Data table carried the same unwanted prefix. This can be fixed directly in Diagram View (double-click the GICS Sector field in the SP 500 table and remove the GICS prefix). We did the same for the other three fields.

Now we can create the hierarchy. We selected SP500 Annual Data, named the hierarchy Sector and Sub-Industry, and included the Sector and Sub-Industry fields from the SP500 Annual Data table. The order in which the fields are arranged forms the hierarchy.

With the relationships and hierarchy created, we can start building reports. We want reports that are dynamic and engaging, with plenty of cool visuals, and we want them to be interactive so that when we publish them, colleagues can review and analyze the data in meaningful or interesting ways. Power View is the self-service BI feature that can do exactly that (and much more).

Power Pivot Summary

In Power Pivot you can customize, extend (with calculations and hierarchies), and manage a powerful Excel data model. Power Pivot works smoothly and automatically with Power Query and with other Excel features, so you can manage and extend your custom database (the data model) in the familiar Excel environment. Power Pivot includes any data you brought in through Power Query, plus any other data you add to the model. In addition, the entire data model in Power Pivot (tables, columns, calculations, hierarchies, and any other customizations) is exposed in Power View as elements available for reports.

For more information about Power Pivot and DAX, see the following links:


Differences Between Microsoft Azure SQL Database Basic, Standard, and Premium


New Microsoft Azure SQL Database service tiers were announced in April 2014 to replace the existing Microsoft SQL Database Business/Web editions. On August 26, 2014, SQL Server product lead Eron Kelly announced that the new Microsoft SQL Database Basic, Standard, and Premium tiers would leave technical preview and become generally available in September 2014 ( http://azure.microsoft.com/blog/2014/08/26/new-azure-sql-database-service-tiers-generally-available-in-september-with-reduced-pricing-and-enhanced-sla/ ). Compared with earlier versions, the new cloud database service offers the following improvements:

  • Uptime service level (SLA) raised from 99.9% to 99.99%
  • Higher maximum size for a single database
  • More predictable performance
  • Self-service restore, with point-in-time recovery ranging from 7 to 35 days depending on the tier
  • Billing by the hour
  • Cross-datacenter disaster recovery for the higher tiers

Compared with traditional Microsoft SQL Server planning, customers choosing among the Microsoft Azure SQL Database tiers can refer to http://msdn.microsoft.com/library/azure/dn369873.aspx. The most important table is excerpted below. The newly announced Standard/S0 tier had not yet been added to that table as of 8/30/2014, so some adjustments have been made based on the DTU figures published so far.

Azure SQL Database tier | DTUs | Max DB size (GB) | Max worker threads | Max sessions | Expected performance
Basic                   | 5    | 2                | 20                 | 100          |
Standard/S0             | 10   | TBC              | TBC                | TBC          | Better
Standard/S1             | 20   | 250              | 50                 | 200          | Better
Standard/S2             | 50   | 250              | 100                | 500          | Better
Premium/P1              | 100  | 500              | 200                | 2,000        | Best
Premium/P2              | 200  | 500              | 400                | 4,000        | Best
Premium/P3              | 800  | 500              | 1,600              | 16,000       | Best

 

Database Throughput Unit (DTU): a composite unit that combines CPU, memory, and data read/write capacity into a single measure. In theory, a 5 DTU performance level is five times that of 1 DTU. Worker threads represent the logical upper bound on the number of threads Microsoft Azure SQL Database allows; think of them as the operating-system thread limit hidden behind the database service, quietly executing the work the service assigns. The session limit refers to the logical units established between server and client for exchanging data; the number of sessions is not strictly equal to the number of physical network connections, but the two are close, so it can be treated as the number of network connections allowed. A defining trait of cloud services is resource sharing, and sharing means limiting each tenant's usage so that other tenants are not affected, so keep these limits in mind when planning your databases.
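To make the table concrete, a small Python helper (the tier figures are copied from the table above; the function itself is my own sketch, not part of any Azure API) can pick the cheapest tier that satisfies a workload's DTU and session requirements:

```python
# (tier, DTUs, max worker threads, max sessions), ordered cheapest first.
# Standard/S0 is omitted because its limits were still unconfirmed.
TIERS = [
    ("Basic", 5, 20, 100),
    ("Standard/S1", 20, 50, 200),
    ("Standard/S2", 50, 100, 500),
    ("Premium/P1", 100, 200, 2000),
    ("Premium/P2", 200, 400, 4000),
    ("Premium/P3", 800, 1600, 16000),
]

def smallest_tier(need_dtus, need_sessions):
    """Return the first (cheapest) tier meeting both requirements, or None."""
    for name, dtus, _workers, sessions in TIERS:
        if dtus >= need_dtus and sessions >= need_sessions:
            return name
    return None
```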

Using language R and Azure Machine Learning to load data from Azure SQL Database


This blog post is composed of two parts:

In the first part, we will see how to load data from Azure SQL Database using the R Language installed on my local machine.

In the second part, we will access the same data using Azure Machine Learning, a fully managed cloud service for data scientists and developers, currently in preview, which provides great support for the R language.

Prerequisites

To load data from Azure, we first need to create an Azure SQL Database. If you are new to Azure, you can activate a 30-day free trial here. The process of creating a new SQL Database server is very easy, and a getting started guide is available here. Once created, you will have access to a control panel, represented in the image below.

SQL Database is a fully managed service, and Azure will take care of things like high availability and backups.

To design the database schema and load data, you can use SQL Server tools like SSMS, which are fully compatible. In our example we will use a dataset with the history of winners of the Italian soccer league (I am a Juventus supporter, so I like the query results shown below very much :) )

  

Part 1 – load Azure SQL Database data using R

In the first part, we will use R installed locally on my Windows 8.1 laptop. To accomplish this part we need to:

  1. Configure an ODBC dsn to Azure SQL Database

  2. Download and Install R bits here: http://cran.r-project.org/bin/windows/base/

How to create an ODBC dsn

In the first step of the wizard, we need to provide a name for the dsn (sqldb) and the name of the Azure SQL Database Server ([servername].database.windows.net)

We need to specify the default database (ItalianSoccerLeague)

And SQL login and password

 

Using R to load SQL Database data

Now we have our ODBC dsn ready. R has lots of packages available, and one of them, called RODBC, enables communication with ODBC data sources. To install an R package we can use the command install.packages("[package name]"). The package will be downloaded from the web.

Once the package is installed, we need to load it using the following syntax

Now we can open a connection to the Azure SQL Database, using the ODBC dsn, plus the User ID and the Password. odbcConnect is the command that we will use, and the result is assigned to the sqldbcnn variable

Last step, using sqlQuery, is to execute the query

If we need to get a summary of the results, we can use the summary function, passing the results variable as a parameter.

 

Part 2 – load Azure SQL Database data using Azure Machine Learning

 

Azure Machine Learning runs inside Microsoft Azure, and it is currently in preview. To get started with Azure Machine Learning you could visit the following page: http://azure.microsoft.com/en-us/trial/get-started-machine-learning/

As we did in part 1 with R running on my laptop, we will now use Azure Machine Learning to load data from Azure SQL Database. Azure ML provides a designer, called Azure ML Studio, a workbench that runs in the browser (e.g. Internet Explorer). You don't need to install anything on your local machine; everything runs inside Azure.

First thing to do is create a new experiment; this will give us access to a graphical designer, the place where the data scientist will spend most of his time.

Azure ML is made of many modules, some of which can help us load and manipulate data. In this case, we need the Reader module, and we just need to drag and drop it into the designer.

Each module has a properties pane, where we can set all the parameters required to connect to Azure SQL Database, including the query string.


That is all we need to do to get data from Azure SQL Database. As an optional step, we could also use a Descriptive Statistics module, and connect it to the Reader module. This will help to have a more detailed view of our dataset

Now we can run our experiment using the RUN icon  

We can visualize the data received from the database using the Visualize command in the context menu.

In addition, we could get some additional descriptive information, like a MultiboxPlot to see a graphical representation of our data, or Mean, Min and Max values.

 

 

Francesco

@francedit

PowerTip: Use PowerShell to Set Primary and Secondary DNS Server Addresses


Summary: Use Windows PowerShell to set the primary and secondary DNS server addresses for a client.

Hey, Scripting Guy! Question: I recently changed the IP address scheme for an entire subnet. How can I use Windows PowerShell to set the primary and secondary DNS server addresses for the client workstations?

Hey, Scripting Guy! Answer: Use the Set-DNSClientServerAddress cmdlet, and specify the primary and the secondary DNS servers as an array, for example:

Set-DNSClientServerAddress -InterfaceIndex 12 -ServerAddresses ("10.0.0.1","10.0.0.2")

AppV 5: Important Considerations and Ramifications for Package Upgrade and VFS Write Mode


If you are running any version of the App-V 5 client or Sequencer prior to App-V 5.0 Service Pack 2 Hotfix 4 – stop reading; this does not apply to your environment. If you are running HF4 or later, you need a good understanding of the net effects of toggling VFS Write Mode on and/or off during package upgrade.

VFS Write Mode was introduced to (my words) “fix bad brake jobs” in application development: it is for applications that need to read and write configuration files in normally protected directories, or that need ACLs modified for standard users so they can write to program locations that normally only administrators have access to (Notepad++ has a way of going back and forth between good and bad program configuration - http://blogs.technet.com/b/gladiatormsft/archive/2014/05/24/app-v-5-on-sequencing-using-tokenized-paths-pvad-s-vfs-and-vfs-write-mode.aspx.)

While VFS Write Mode involves setting a specific attribute within the package manifest, the effects on how the VFS (Virtual File System) and COW (Copy-on-Write) filter are handled for the user are significant. As a result, making any changes to this setting during package upgrade could have some effects.


Scenario 1: Package Upgrade where the VFS Write Mode setting is not changed

This one is pretty straight-forward. No considerations are needed. Nothing is changed in how the VFS handles things.

 

Scenario 2: Package Upgrade where VFS Write Mode setting is turned on when previously not enabled.

When this happens and directories previously existed in the COW but there was no “S” directory, then the original directory will be renamed to the “S” version and the permissions appropriately adjusted. The new “regular” directory will be created. So instead of just one directory (i.e. ProgramFilesX86) there will be two (adding ProgramFilesX86S.) For more information on the “S” directories, please refer to my previous post here (http://blogs.technet.com/b/gladiatormsft/archive/2014/08/29/app-v-5-on-the-now-sometimes-immutable-package-cache-location.aspx.)

 

Scenario 3: Package Upgrade where the VFS Write Mode setting is turned off when previously enabled.

In this scenario, there will likely be two existing structures: directories appended with “S” and the regular directories with the relaxed permissions. When this happens, the relaxed-permissions directories will be deleted and the “S” directories will be renamed back to their original names. If this is ever done unintentionally, you can imagine some of the issues that may result, so be careful doing this.

 

I would advise always avoiding Scenario 3. Scenario 2 will be quite common as many sequencers and packagers are now upgrading their pre-SP2 HF4 App-V packages in order to take advantage of the new VFS Write mode feature. The question I am getting asked a lot lately is whether it is better to re-sequence with VFS Write Mode on or to simply just perform a package upgrade. I would advise trying the package upgrade first to turn on the value. In most cases, this should work – but as always, I await your comments.

Top Contributor Awards! Compatibility! Security! Tutoring! Customisation! Duplication! WE GOT IT ALL! And it's yours! And it's free! Happy Days!


Welcome back for another analysis of contributions to TechNet Wiki over the last week.

First up, the weekly leader board snapshot...

 

Ed (who is fast becoming a well known figure at TechNet Wiki) ROCKETS to the top of the list this week! Excellent work sir! Also showing high in the new articles rota!

 

As always, here are the results of another weekly crawl over the updated articles feed.

 

Ninja Award: Most Revisions Award
Who has made the most individual revisions
 

 

#1 Richard Mueller with 92 revisions.

  

#2 Markus Vilcinskas with 87 revisions.

  

#3 Ed (DareDevil57) with 84 revisions.

  

Just behind the winners but also worth a mention are:

 

#4 Gokan Ozcifci with 84 revisions.

  

#5 saramgsilva with 66 revisions.

  

#6 asmalser with 65 revisions.

  

#7 Sandro Pereira with 57 revisions.

  

#8 Dana Bolze with 39 revisions.

  

#9 Carmelo La Monica with 29 revisions.

  

#10 Durval Ramos with 24 revisions.

  

 

Ninja Award: Most Articles Updated Award
Who has updated the most articles
 

 

#1 Richard Mueller with 73 articles.

  

#2 Gokan Ozcifci with 61 articles.

  

#3 Ed (DareDevil57) with 50 articles.

  

Just behind the winners but also worth a mention are:

 

#4 Sandro Pereira with 21 articles.

  

#5 saramgsilva with 17 articles.

  

#6 Markus Vilcinskas with 13 articles.

  

#7 Durval Ramos with 10 articles.

  

#8 Davut EREN - TAT with 7 articles.

  

#9 Jefferson Castilho with 6 articles.

  

#10 Carmelo La Monica with 5 articles.

  

 

Ninja Award: Most Updated Article Award
Largest amount of updated content in a single article
 

 

The article to have the most change this week was DUPLICATE: Deploy an Application with System Center Configuration Manager R2., by Ed (DareDevil57)

This week's reviser was Richard Mueller

Why highlight a duplicate, which may even be gone by the time you read this?

Well, partly because it was the article with the most change, in a minus direction of course, but also the love from Richard shows responsible Wiki conduct, thanks Richard. But also because the original version Ed posted here is an excellent read anyway.

 

Ninja Award: Longest Article Award
Biggest article updated this week
 

 

This week's largest document to get some attention is Compatible Tape Libraries for System Center 2012 DPM, by Rayne Wiselman [MSFT]

This week's reviser was pwl63

Well I never! I didn't know ML6000 was compatible with System Center 2012!!! Probably because, in all honesty, I never needed to know...

But that's because I'm probably not as awesome as you and Rayne, so I bow before this monster article that continues to give love to those more awesome than me.

  

Ninja Award: Most Revised Article Award
Article with the most revisions in a week
 

 

This week's most fiddled with article is TechNet Guru Contributions for August 2014, by XAML guy. It was revised 28 times last week.

This week's revisers were Carmelo La Monica, Bodo Michael Danitz, Ike Ugochuku - IdM-FIM Consultant, Andy ONeill, Nonki Takahashi, Mang Alex, Ed Price - MSFT, Nathan Foreman, Sandro Pereira, Johns-305 [boatseller], saramgsilva, mcosmin, Steef-Jan Wiggers, Jaliya Udagedara, Brian Nadjiwon, Gai3kannan & Mr X

Go go Gurus! The sensational flood of facts and food for thought continues to pour out of every pore of so many community minded individuals that enrich our lives with their own work and revelations. Let us bask in their glory! One day to go till August deadline!! 

 

As Guru often wins, I highlight this week's SECOND most fiddled with article, Securing BizTalk endpoints leveraging Sentinet API Management Part 2, by Steef-Jan Wiggers. It was revised 21 times last week.

This week's reviser was Steef-Jan Wiggers

Steef-Jan is another of those names I'm sure you're all familiar with, as he is a major figure in TechNet Wiki. His contributions are always worth a read, as he is a proven leader in BizTalk circles. This new arrival won't disappoint either!

  

Ninja Award: Most Popular Article Award
Collaboration is the name of the game!
 

 

The article to be updated by the most people this week is Tutorial: Azure AD Integration with Bamboo HR, by Markus Vilcinskas

Skipping past the August Guru article, this comes second and worth the win. Although it currently says "work in progress, do not edit", five people have decided it's good enough for a few minor community love tweaks. 

This week's revisers were Dana Bolze, Richard Mueller, asmalser, Markus Vilcinskas & Ed (DareDevil57)

 

Also five editors for his Tutorial: Azure AD Integration with Samanage, by Markus Vilcinskas

This is a good thing, being read and tweaked is better than gathering cobwebs unseen in a corner of the Wiki. Great work so far Markus! 

This week's revisers were Dana Bolze, Richard Mueller, asmalser, Ed (DareDevil57) & Markus Vilcinskas

  

Ninja Award: Ninja Edit Award
A ninja needs lightning fast reactions!
 

 

Below is a list of this week's fastest ninja edits. That's an edit to an article made shortly after another person's edit.

  

Ninja Award: Winner Summary
Let's celebrate our winners!
 

 

Below are a few statistics on this week's award winners. Still a little out of date, due to work loads.

Most Revisions Award Winner
The reviser is the winner of this category.

Richard Mueller

Richard Mueller has been interviewed on TechNet Wiki!

Richard Mueller has won 85 previous Top Contributor Awards. Most recent five shown below:

Richard Mueller has not yet had any featured articles or TechNet Guru medals (see below)

Richard Mueller's profile page



Most Articles Award Winner
The reviser is the winner of this category.

Richard Mueller

Richard Mueller is mentioned above.



Most Updated Article Award Winner
The author is the winner, as it is their article that has had the changes.

Ed (DareDevil57)

Ed (DareDevil57) has won 4 previous Top Contributor Awards:

Ed (DareDevil57) has not yet had any featured articles, interviews or TechNet Guru medals (see below)

Ed (DareDevil57)'s profile page



Longest Article Award Winner
The author is the winner, as it is their article that is so long!

Rayne Wiselman [MSFT]

Rayne Wiselman [MSFT] has won 3 previous Top Contributor Awards:

Rayne Wiselman [MSFT] has not yet had any featured articles, interviews or TechNet Guru medals (see below)

Rayne Wiselman [MSFT]'s profile page



Most Revised Article Winner
The author is the winner, as it is their article that has been changed the most.

XAML guy

XAML guy has featured articles on TechNet Wiki!

XAML guy has been interviewed on TechNet Wiki!

XAML guy has TechNet Guru medals, for the following articles:

XAML guy has won 54 previous Top Contributor Awards. Most recent five shown below:

XAML guy's profile page

Steef-Jan Wiggers

Steef-Jan Wiggers has featured articles on TechNet Wiki!

Steef-Jan Wiggers has been interviewed on TechNet Wiki!

Steef-Jan Wiggers has TechNet Guru medals, for the following articles:

Steef-Jan Wiggers has won 12 previous Top Contributor Awards. Most recent five shown below:

Steef-Jan Wiggers's profile page



Most Popular Article Winner
The author is the winner, as it is their article that has had the most attention.

Markus Vilcinskas

Markus Vilcinskas has featured articles on TechNet Wiki!

Markus Vilcinskas has been interviewed on TechNet Wiki!

Markus Vilcinskas has TechNet Guru medals, for the following articles:

Markus Vilcinskas has won 13 previous Top Contributor Awards. Most recent five shown below:

Markus Vilcinskas's profile page

Markus Vilcinskas

Markus Vilcinskas is mentioned above.



Ninja Edit Award Winner
The author is the reviser, for it is their hand that is quickest!

Ugur Demir - TAT

Ugur Demir - TAT has been interviewed on TechNet Wiki!

Ugur Demir - TAT has won 5 previous Top Contributor Awards:

Ugur Demir - TAT has not yet had any featured articles or TechNet Guru medals (see below)

Ugur Demir - TAT's profile page

Mehmet PARLAKYIGIT-TAT

Mehmet PARLAKYIGIT-TAT has been interviewed on TechNet Wiki!

Mehmet PARLAKYIGIT-TAT has won 12 previous Top Contributor Awards. Most recent five shown below:

Mehmet PARLAKYIGIT-TAT has not yet had any featured articles or TechNet Guru medals (see below)

Mehmet PARLAKYIGIT-TAT's profile page



Another week of eye watering delights from some great names in the community. Keep it up folks, I hunger for more!

 

Best regards,
Pete Laker (XAML guy)

 

Free E-Books


If you are fond of e-books on technology topics, you are in for several days of reading. Here is the link to one of the largest lists of free books on Microsoft technologies.

How to fix an issue to export Project Center to Excel


Sometimes we need to export Project Center content to Excel to create a simple report with some comments, but we usually encounter a problem: when we try to export to Excel, we receive the message below:

“The ActiveX control on which this feature depends could not be created. Because of this error, you can only copy the XML data to the Clipboard. Do you want to continue?”

To solve this problem we need to add Project Server as a trusted site and enable an ActiveX control.

Let's do it.  

  1. Add Project Server PWA web site as a trusted site on Internet Explorer;
  2. Edit trusted sites custom level security;
  3. Find option: “Initiate and script ActiveX control not marked as safe for scripting”
  4. Set this option to "Prompt"

Now you will be able to export Project Center content to Excel.


How to create a private cloud step by step with System Center part 1: Lab Setup

$
0
0

Today I decided to start a series of posts while I rebuild my home lab from scratch. In this series I will cover all the basic knowledge you need to set up a private cloud in a lab environment.

 

This series assumes you have a basic understanding of Hyper-V management and Virtual Machines manual deployment, basic Active Directory and DNS.

 

This post series will be in Spanish and English for your convenience.

 

When I went out to buy the components for my setup, I knew I wanted a two-node cluster plus another machine to host some administrative VMs and act as my iSCSI storage provider, so I could test the storage management capabilities in VMM 2012 R2. Because I didn't need overclocking, I found a pretty sweet deal on an i7 4770 for the virtualization nodes and an i5 4440 for the storage PC.

 

I had a bunch of SATA 3 disks from the updates I've been doing to my gaming setup lately, so I decided to reuse them as follows:

 

2 x Cluster Node PC:

Processor:

Core i7 4770

RAM:

32 GB 1600 MHz

Storage:

500 GB HDD

Network:

3 NICs

 

Storage:

Processor:

Core i5 4440

RAM:

32 GB 1600 MHz

Storage*:

120 GB SSD for the operating system

2x120 GB SSD + 240 GB SSD for the fast storage pool

2x500 GB HDD for the slow storage pool

Network:

2 NICs

*Notice I'm not going to use any RAID technology, just the storage pool capabilities of Windows Server.

 

With this setup I'm able to deploy my private cloud with no problems and test some advanced new features of VMM 2012 R2.

 

For the networking part I pretty much use my standard high-end home router from Linksys, plus the cheapest managed Cisco switch I could find that supports VLAN tagging, in case I want to test some VLAN isolation later on.

 

Preparing the setup for VMM:

 

1. Install Windows Server 2012 R2 in the Storage PC and apply all updates. 

 

2. Install the Hyper-V role on the Storage PC.

 

3. Create a Directory to store your Virtual Machines and VHDX.


4. Create a blank Gen 2 VM called Template and install Windows Server 2012 R2; this will be our template VHDX. You can also use this script to create your base VHDX.

 

5. If you decided to build your template from scratch, you need to run Sysprep to generalize the image and make it usable as a template:

    1. Run CMD as administrator.
    2. Go to %Windir%\System32\Sysprep.
    3. Run Sysprep.exe.
    4. In the window, select Out-of-Box Experience and Generalize.

 

6. Don't turn on the Template VM. Go to the directory you created for your VMs and VHDX files, find the VHDX of the Template VM you created, and copy it to one subdirectory called DC and another called VMM, so we have the base disks for our first two machines.


7. Create 2 new VMs, one called VMM and another called DC01, and point them to the respective virtual disk you copied to each directory in the previous step.


8. Start both machines and logon as local administrator.

 

9. Promote the DC01 as a domain controller and DNS Server.

 

10. Create a virtual switch from one of the NICs; do not allow the management operating system to communicate through it, since you have another NIC for management.

 

11. Join the VMM Server and the Storage Server to the domain.

 

After this basic setup we are ready to start installing our environment. I know many of you use differencing disks for lab setups; that's OK, but I want to make this as real as possible so you can see more detail in the process.

 

Next post: Installing VMM

Fibonacci Numbers


I know it's out of left field, but I thought it would be fun to share. In my daily work, I'm finding that more often than not Scrum is the de facto standard framework for dealing with complex software products. Although strictly speaking not a part of Scrum itself, the game of Planning Poker is often used to estimate feature delivery time. In it, Fibonacci numbers are used (0, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144) as an agreed-upon estimation unit (such as duration days or story points). Fibonacci numbers are used there to reflect the inherent uncertainty in estimating larger items (the longer an estimate is, the more uncertainty it contains). In the Scrum Poker app I'm using, for some reason a small variation on the Fibonacci sequence is used, namely: 0, ½, 1, 2, 3, 5, 8, 13, 20, 40, 100, a ? (unsure) and a coffee cup (I need a break) and I've read on Wikipedia that more Scrum Poker apps are doing this.
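Since the sequence itself is the point here, a quick Python sketch that generates it is below; note that the standard Planning Poker deck simply drops the duplicate 1 from the true sequence:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, ..."""
    seq = []
    a, b = 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

# Deduplicating the first 13 terms yields exactly the card values
# 0, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144 mentioned above.
cards = sorted(set(fibonacci(13)))
```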

Anyway, recently I was on holiday visiting Pisa and Florence, and I came upon this statue of Fibonacci and I thought that was really cool so I took a picture. Now, most of my friends are really not interested in this sort of stuff so I was wondering who I could bother with this. Finally, it came to me :-).

How to create a private cloud step by step with System Center part 2: VMM Setup


In the second part of this series I will show you the basic installation of VMM. Remember this is a test environment; in production you would separate the SQL role and scale VMM to suit your needs.

Prerequisites

  • I always create an SCAdmin account for my labs so I have one administration account for all my System Center servers; make it a local admin on your servers and add it to your admin groups. I usually do all my installation and update tasks with this user.
  • VMMadmins for your VMM administrators - local admin on the VMM server.
  • VMMaction to manage the Hyper-V hosts - local administrator on all Hyper-V nodes to be managed.
  • VMMservice to run the VMM local services - local administrator on the VMM server.
  • SQLservice for the SQL services.

Installing SQL 2012 with SP2:

  • Installing SQL is pretty straightforward, so I will go through the defaults I use for my test setup; in a production environment you should ask your DBA for advice on best practices.
  • Because this is a local install, you won’t need to disable your firewall or add any rules to it.
  • Follow the setup until you get to the Setup Role page and select SQL Server Feature Installation.
  • Select Database Engine and Management Tools on the Feature Selection page.
  • Keep the default instance to make the installation easy.
  • In Server Configuration, set the SQL Server Agent to Automatic and change the accounts to use your SQLservice account.
  • Check that the collation is set to SQL_Latin1_General_CP1_CI_AS.
  • Add your SCadmin user to the SQL Server administrators on the Database Engine Configuration page.
  • Press Install when you are asked to.

Installing VMM 2012 R2

  • Open the install media and open setup.exe if the install page doesn’t show when you double click.
  • Click the install link.
  • Select VMM Management Server and the Console will be selected by default, click next.
  • Insert your product key and click next.
  • Follow the setup, making the selections that make the most sense to you.
  • On the prerequisites page you will see an error about the deployment tools not being installed; follow this link, because the one in the setup will not work.
  • Run the installer and install just the Deployment Tools and Windows Preinstallation Environment (Windows PE).
  • Close the wizard when the installation is completed.
  • Go back to the VMM setup and click the Verify Prerequisites Again button, click next.
  • In database configuration, leave the defaults and just select your default instance from the drop-down, then click next.
  • Select Domain Account and enter your VMMservice credentials in DOMAIN\User format.
  • We need to create the container where the VMM key will be stored for future recovery, upgrades, or a highly available (clustered) VMM installation.
    • Go to your domain controller (DC01) and run ADSI Edit.
    • Remember that the container should be created in the same domain as your install user.
    • Right-click ADSI Edit in the left pane and select Connect to, then click OK to accept the defaults.
    • Double-click Default naming context and then your domain name.
    • Right-click your domain name and select New, then Object.
    • Select Container and click next.
    • For Value, use VMMK and click next.
    • Click finish.
    • Right-click the newly created CN=VMMK item and select properties.
    • Select Security and click Advanced.
    • In the Advanced Security window, click Add.
    • Click Select a principal, look for your installation account, and click ok.
    • Give the following permissions: Read, Create all child objects, and Write all properties.
  • Now that we have our container, let’s go back to our VMM server and to the setup.
  • Check the box for Store my keys in Active Directory and enter the container in distinguished-name format. For example, if my domain is CONTOSO.com, it should be CN=VMMK,DC=CONTOSO,DC=com. Click next.
  • Leave the defaults in ports and click next.
  • Leave the defaults for the library and click next.
  • Click Install.
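The distinguished-name format described above maps mechanically from the DNS domain name; the following tiny Python sketch (purely an illustration of the naming rule, not part of VMM, and the helper name is my own) shows how the DN is built:

```python
def vmmk_container_dn(domain, container="VMMK"):
    """Build the distinguished name VMM expects for its key container,
    e.g. 'CONTOSO.com' -> 'CN=VMMK,DC=CONTOSO,DC=com'."""
    dc_parts = ",".join(f"DC={part}" for part in domain.split("."))
    return f"CN={container},{dc_parts}"

print(vmmk_container_dn("CONTOSO.com"))  # CN=VMMK,DC=CONTOSO,DC=com
```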

In the next part we will prepare to manage the first host and deploy a virtual machine.

Weekend Scripter: Parse Folder of Data with PowerShell


Summary: Use Windows PowerShell to parse a folder that contains data dumps.

Microsoft Scripting Guy, Ed Wilson, is here. This morning I got up, and headed to the Farmers Market in town. I picked up a fresh melon for breakfast and some locally made cheese. Melon and cheese with a nice cup of English Breakfast tea…not a bad meal.

I am sitting on the back porch, munching melon, sipping tea, and checking my email on my Surface Pro 3. (The cheese was pretty much gone as soon as I sliced it—not saying what happened to it, just that it did not make it to the porch.)

I decided that I need to do one more thing for my Data Manipulation Week. I am going to modify the function from yesterday (see A Function to Clean Up Data Import), so that I can pass a file or a folder full of files to it, and do the data transformation.

The situation

I have a folder that contains several data dumps. In fact, some of the data dumps are not even germane to my database. Luckily, the data dumps that I need all begin with the word DataDump. Here is a look at the DataIn folder:

Image of menu

The DataDump2.csv file contains a couple of additional records—in fact, there are a couple of new errors that I did not see in my previous DataDump file (such as a capital “O” in the street name ROck Query Road). This is shown here:

Image of command output

Dude, how hard is that? Not very...

When I first thought about parsing a directory and finding all the DataDump files, I thought, well...like...I will need to write a new script.

But I do not have to do that. The reason is that I have a nice function that accepts a path to a file. So all I need to do is to collect my paths to files, and I am set. To do that, I use the Get-ChildItem cmdlet, and I specify that I want to recurse through the folder. I can use the –filter parameter to specify that I only want files that begin with the word DataDump.

When I have a collection of FileInfo objects that point to the CSV files, I want to process each of the files and run my Convert-Data function on them. Here is the command I use (gci is an alias for Get-ChildItem and % is an alias for ForEach-Object):

gci C:\DataIn -Filter datadump* -Recurse | % {Convert-Data -path $_.FullName}

That is it. I did not need to write a script to do this.

A new consolidated CSV file...

Now I need a new consolidated CSV file so that I can use it to import data into my new database. No problem, I use the previous command (I simply use the Up arrow). Then at the end, I pipe it to Export-CSV and I point it to the DataOut folder. Here is the command:

gci C:\DataIn -Filter datadump* -Recurse | % {Convert-Data -path $_.FullName} |  Export-Csv -Path C:\DataOut\DataDumpOut.csv -Append -NoTypeInformation

And here is the newly created CSV file:

Image of command output

Join me tomorrow when I will begin a discussion about tuples…you will love it.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 

Microsoft IFA Trends



IFA 2014, the world's largest trade show for consumer electronics, starts in a few days. As in past years, hardware manufacturers and telecommunications companies will present new devices in a wide range of form factors and price categories based on Windows 8.1 and Windows Phone 8.1. This year we see the following trends:

Trend 1: Small and affordable devices
Microsoft's hardware partners already offer a large selection of small and affordable devices running Windows 8.1. These devices stand out for their handy form combined with full Windows performance, and are well suited for quick, uncomplicated access to email and apps as well as for browsing the web. Despite their small size (7 to 8 inches), they keep up with larger devices in terms of battery life and software. That makes small tablets running Windows 8.1 ideal for use on the go or as a handy way to control home devices. Images of small tablets with Windows 8.1 that are already available can be found here.

In the entry-level price segment, TrekStor, for example, is presenting its first Windows tablet, the SurfTab wintron 10.1, at IFA 2014. Microsoft also offers attractively priced smartphone options; the Nokia Lumia 635 is one of the most affordable smartphones with LTE.

Trend 2: 2-in-1 devices – Surface Pro 3 on display at the Intel booth
Hybrid models are as mobile as a tablet, yet as powerful as a laptop. With a few simple steps, for example via attachable keyboards, they transform from one device into the other and can be operated by touch as well as by keyboard. That makes them ideal for private and business users who need a powerful yet portable device but do not want to carry two devices around. As one of the thinnest variants of this modern product class, Microsoft's new high-end device Surface Pro 3, equipped with fourth-generation Intel® Core™ processors, will be on display at the Intel booth in Hall 16 shortly after its launch in Germany.

With Surface Pro 3, Microsoft is bringing a high-end device in this segment to the German market on August 28, 2014, shortly before the start of IFA 2014. Surface Pro 3 differs from other 2-in-1 devices above all through its large, high-resolution 12-inch ClearType Full HD Plus display, which the well-known display evaluation firm DisplayMate Technologies rated as one of the best displays currently available. Microsoft's new Surface generation also offers an integrated, continuously adjustable kickstand that allows flexible use of the device, even on your lap. In addition, Surface Pro 3 comes with an improved battery life of up to nine hours. Last but not least, the Surface Pen, with improved pressure sensitivity, enables digital note-taking through natural pen input. Detailed images and product specifications for Surface Pro 3 and its accessories can be found in the press kit on the Microsoft Newsroom. Images of other 2-in-1 devices with Windows 8.1 can be found here.

Sources for Microsoft news at IFA 2014
During IFA week, the Microsoft Newsroom provides daily updated background information as well as image and video material about Microsoft and IFA.

 

A post by Irene Nadler
Communications Manager, Devices and Services

PowerTip: Using PowerShell to Determine if Path Is to File or Folder


Summary: Learn how to use Windows PowerShell to determine if a path is to a file or a folder.

Hey, Scripting Guy! Question How can I use Windows PowerShell to determine if a path is to a file or a folder?

Hey, Scripting Guy! Answer Use the Get-Item cmdlet to get the object represented by the path. Then use the –is operator to see if the object is a [system.io.directoryinfo] object or a [system.io.fileinfo] object. Here is an example:

PS C:\> (Get-Item c:\fso) -is [System.IO.DirectoryInfo]

True

PS C:\> (Get-Item C:\fso\csidl.txt) -is [System.IO.DirectoryInfo]

False

Complete “Einrichten und Ausrollen einer Private Cloud mit Microsoft Technologien” and win!!


Einrichten und Ausrollen einer Private Cloud mit Microsoft Technologien

Admittedly, this is no longer the newest course on the Microsoft Virtual Academy. Nevertheless, it still offers a wealth of interesting information. And the course definitely pays off; see the contest at the end of this post!!

This free online training from the Microsoft Virtual Academy demonstrates how to plan and build a private cloud. You will learn everything you need to know about the core components of Windows Server and System Center to manage your virtualized and physical resources in the cloud. The Microsoft Virtual Academy presenters guide you step by step through the configuration and management of your cloud, show you technical details, and walk you through examples.

Since the training is held in English, here is the content description in English:

During the 8 modules of this specialization, you will be introduced to all the elements of building the Microsoft private cloud. You’ll learn how to optimize and deploy the private cloud starting at the infrastructure layer. You’ll also be introduced to advanced virtualization management features and the concept and implementation of the System Center’s private cloud application service model.

After completing all of the modules you will have an understanding of:

  • How using Microsoft System Center 2012 can help you build, deploy and manage a private cloud infrastructure.
  • How System Center 2012 incorporates tools to deploy, update, and manage applications within your private cloud.
  • System Center 2012’s new abilities to deploy, update, and manage applications within your private cloud.
  • How using new components of System Center 2012, specifically the Orchestrator and Service Manager components, enable you to deploy, update, and manage service offerings within your private cloud.
  • How applications are deployed and managed in the Microsoft private cloud.
  • How to use new capabilities in System Center 2012 to deploy your applications as services.

The contents of the individual modules:

  • 01 | Configure and Deploy Infrastructure Components
    This module explains how to build a private cloud, with specific focus on individual infrastructure components that enable this. It explains how System Center 2012 gives you a set of tools to configure and deploy infrastructure components including compute, storage, network, and cluster resources. It will show you how to use System Center to build a private cloud, discussing advanced topics such as bare-metal deployment of compute resources, optimizing storage components, and creating logical networks in the private cloud.
  • 02 | Configure and Deploy Private Cloud Infrastructure
    This module explains how enterprise data centers are evolving toward highly virtualized infrastructures. It explains how System Center 2012 gives you a set of tools to configure and deploy Infrastructure resources and manage the environment; both the underlying physical infrastructure and the virtualized infrastructure. This module will show you how to use System Center to build a private cloud that aligns with the Microsoft cloud and data center management vision, to deliver common management experiences across private and public clouds. The module illustrates how to deliver IT as a service on your terms with flexible management across your hybrid cloud environments.
  • 03 | Configure and Deploy Service Delivery & Automation
    This module will show you how to use System Center to design a private cloud that aligns with the Microsoft cloud and application management vision. This vision will deliver standardization of your service and request offerings, as well as standardizing your automation process. Further, we will illustrate Microsoft’s approach to self-service delivery. Specifically showing you examples of how to use the new portal functionality in System Center 2012, to empower your end users to request service offerings based on their role in the organization. Then we will detail how the approval and delivery of these offerings occurs.
  • 04 | Configuration & Deployment
    This module explains the concept of application services and covers the types of services available to you, as well as the basic steps needed to configure and deploy that service. You’ll learn how to use new tools in System Center 2012 to manage the underlying resources of an application using the service template model, and how to delegate control and access to those templates. We’ll also show how to work with developers to optimize applications for service-level monitoring across both private and public clouds.
  • 05 | Monitor and Operate Infrastructure Components
    This module explains how to build a private cloud, with specific focus on individual infrastructure components that enable this. It explains how System Center 2012 gives you a set of tools to monitor and operate your infrastructure components, enabling you to keep your infrastructure components up and running. We further explain how to understand your infrastructure through performance monitoring, as well as how to optimize the health of your infrastructure. These practices will enable you to monitor applications deployed in your private cloud and on Windows Azure, further allowing you to increase overall data center availability.
  • 06 | Monitor and Operate Private Cloud Infrastructure
    In this module you will learn the benefits of monitoring and operating virtualized infrastructure services in System Center 2012. As well, you will understand the benefits of using services in System Center 2012. Through this module, you will see how using System Center 2012 to provision your private cloud infrastructure and applications can deliver a flexible and cost-effective infrastructure. This module will teach you the basics of deploying and upgrading your virtual environment, the fundamentals of fabric management, and private cloud operating scenarios. By learning the fundamentals of building and delegating clouds, you will be able to better provision and optimize your application services and virtualization management practices.
  • 07 | Monitor and Operate Service Delivery & Automation
    This module builds on the previous module, and further explains the advanced monitoring and operating functionality of two key System Center components, Orchestrator and Service Manager. This will show how IT Pros will gain end-to-end visibility into the automated processes that make their private clouds. Further, to ensure accurate and timely responses to end-user service requests, we will review System Center’s new service level agreement functionality. This provides the IT Pro with expedited methods to challenging operational issues such as release, incident and change management. Part of this module will be dedicated to new reporting functionality that allows the IT Pro to maintain a consistent window into the private cloud using System Center’s business intelligence and dashboard capability.
  • 08 | Application Services Management, Operation & Management
    In this module you will learn new ways to operate and manage applications and workloads using the service template paradigm. New tools in System Center 2012 will allow for improved performance and monitoring of your applications. You’ll learn how to keep your applications running healthy and how to quickly re-allocate resources to ensure they continue to be healthy in the case of unforeseen problems or new requirements. You’ll see how these new tools can radically decrease the time required to resolve problems by giving IT administrators the ability to drill down to the performance characteristics of individual virtual and physical service resources, as well as at the code level. See how System Center 2012 will provide this new level of insight into the applications your organization depends upon to do business, and how you can use that insight not just to keep those applications running smoothly, but to make them perform at a new level.

So there's a whole heap of interesting content here, which you can then put straight into practice.

Have fun learning!!

Microsoft Virtual Academy

OK, now for the contest: Could you use a 32 GB memory stick? Yes? Then here is my offer:

Everyone who completes this course by September 14, makes the completion visible in their MVA profile, and notifies us in time will receive a 32 GB memory stick from us. Well? Is that an offer or what?

Contest instructions in detail:

    1.) Complete the course “Configuring and deploying Microsoft's Private Cloud” in full (=100% finished).

    2.) Make your completed courses visible in your MVA profile:

    3.) Send an e-mail to Gerhard.Goeschl@Microsoft.com with the link to your MVA profile.

IMPORTANT: To participate, the e-mail must be sent no later than September 14, 2014, 23:59!!

 

And here is the fine print (unfortunately it has to be):

The contest is run by Microsoft Österreich GmbH. All persons with their main residence in Austria and an Austrian postal delivery address are eligible to participate. Microsoft employees and their relatives, as well as public officials, are excluded. Minors may participate if their legal guardian has consented in advance to their participation and to these terms.

A prerequisite for participation is completing (=100% finished) the course “Configuring and deploying Microsoft's Private Cloud”, publishing the course completion in your MVA profile (https://www.microsoftvirtualacademy.com/MyMVA/MyProfile.aspx), and notifying us by e-mail to Gerhard.Goeschl@Microsoft.com no later than September 14, 2014, 23:59. The date and time of arrival in the inbox of Gerhard.Goeschl@Microsoft.com apply. Legal recourse is excluded.

If more entries are received than prizes can be awarded, we reserve the right to award the prizes in the order the notifications were received and, if necessary, to end the contest early. Accordingly, we reserve the right to change the terms of participation if necessary and to publish any changes here.

The winners will be notified by e-mail and agree to the publication of their first name and the initial of their last name on TechNet Austria. Legal recourse is excluded, and payment in cash is not possible. By participating, you accept these terms fully and unconditionally.

By submitting your data, you consent to the storage of your data by Microsoft Corporation in the USA and by Microsoft Österreich GmbH. The data collected is used solely to draw and notify the winners; it is not used for other advertising purposes or passed on to third parties. You may revoke this consent at any time with effect for the future. Objections should be addressed to Microsoft Österreich GmbH, Am Europlatz 3, A-1120 Wien, or austria@microsoft.com.


Sunday – Surprise Weekend – ALM Summit Brasil 2014


Hello TechNet Community! Today is Sunday, surprise weekend!!!

Today I bring you everything about the biggest ALM (Application Lifecycle Management) event in Brazil, ALM Summit Brasil 2014, which took place this August 29 and 30 at Microsoft Brasil, in São Paulo!

Over the two days of the event, the latest ALM news was presented: practices, processes, tools, success stories from companies that have adopted ALM practices, and discussion panels with MVP (Most Valuable Professional) specialists, Microsoft PFEs, MCTs (Microsoft Certified Trainers), TechNet Wiki Ninjas, ALM Rangers, and community enthusiasts! It was an event with great success and great community participation!

And as you might expect, the TechNet Wiki Portal is one of the places with the most descriptive technical material about ALM, such as step-by-step installation guides, tips, product integrations, and much more.

In this post I have organized everything about the event by topic, and I will keep updating it as the materials are published by the organizers, Microsoft, and the speakers, so the community can access them!

Sites

Official Event Site

Event Page

Organizer's Site

Site about ALM

Official Microsoft Site about ALM

TechNet Wiki Portal with Articles about ALM

Presentations

DevOps - VSALM e System Center Um Casamento de Sucesso
DevOps - Como Atualizar meu Ambiente para o Team Foundation Server 2013
Dev + Testes - Como ser um bom Administrador do Team Foundation Server

I will be posting more soon...

Speakers

Adriano Bertucci, MVP
André Dias, MVP
Alan do Nascimento Carlos, Technet Wiki Ninja
Carlos dos Santos, TechNet Wiki and MSDN Moderator
Fernando Barbieri, MCSD ALM
Giovanni Bassi, MVP
Igor Abade, MVP
Igor Macedo, ALM Specialist
José Freire Neto, MCP
Leandro Prado, Microsoft PFE
Márcio Sete, MVP
Marden Menezes, Microsoft Azure Specialist
Mauricio Alegretti, MVP
Ramon Durães, MVP
Vinicius Hana, MVP
Vinicius Moura, ALM Rangers
William Rodriguez, ALM Specialist

Event Photos

Album 01

I hope you make the most of these excellent materials available here on the blog and on the TechNet Wiki Portal, and start rethinking the way you develop and maintain your applications using ALM practices!!!

See you next time!

Alan Carlos
Technet Wiki Ninja

Divide Discovery and In-Place Hold searches into smaller units


This blog has been in the works for longer than I would like to admit :).  Since I started, a few things have changed.  For example, when this was first started in the fall, the maximum number of mailboxes in a Discovery Search was 5,000.  Today it is 10,000.  In time the limit will probably change again, or it may even disappear at some point.  For the moment, however, the primary focus of this blog is to present a script that will show you how to create searches for a specific number of mailboxes.

       

Why do this?  There are two reasons.

• If you have a search that is failing or timing out, a common troubleshooting step is to try the same search on a single mailbox.  If it works, the next question is: does it work with 10 mailboxes?  100?  500?  Once you discover a comfortable threshold, you might have thousands (in some Office 365 tenants, hundreds of thousands) of mailboxes that need to have the same search applied.  Creating all of those searches by hand is at best tedious and at worst impossible.  The mailbox selection dialog in the web interface only shows 500 mailboxes.
• You need to apply the search to more than 10,000 mailboxes.

If you need to know more about the basics of Discovery Searches before reading further, you can get a great overview of how to handle Discovery Searches and In-Place Hold from parts I and II of this blog:

http://blogs.technet.com/b/exchange/archive/2012/09/26/in-place-e-discovery-and-in-place-hold-in-the-new-exchange.aspx

       

The cloud does have some differences from the on-premises world:

• There is a maximum of 10,000 mailboxes per search.
• Only 2 searches can run simultaneously (though 32 can be queued).
• In the on-premises world you can precisely measure the load on the server that is doing the Discovery Search.  In the cloud this detail is hidden.  If your search is failing due to timeouts, you may have to break it into smaller pieces or try it at a time when fewer administrators of other tenants are doing the same thing.  To some extent this is a little like deciding what time of day to go to the bank.  If everyone else goes to the bank at the same time, the tellers (servers/bank employees) will take longer to serve you.  You might even give up and walk away (a timeout error).
• The destination Discovery Mailboxes are capped at 50 GB.

NOTE:  Please remember that all the numbers given here are subject to change as Office 365 evolves.

       

So how do you work with a large number of users?  Most administrators find it awkward to deal with the Office 365 web interface for adding thousands of mailboxes to a query, because the selection dialog only shows 500 mailboxes.  There are a couple of ways to deal with this:

• Create distribution groups with up to 10,000 members in each.  You can then base the search on the distribution group's membership instead of adding the individual mailboxes.
• Use a script like the one I provide in this blog to add the individual mailboxes to each search.

       

The first way has the advantage that once the group is built, you need only add one item to the Discovery Search you wish to run (remember, the group membership must not exceed 10,000 mailboxes).  One catch is that if the group membership changes after the Discovery Search has already been started, the change is not automatically cascaded into the search results.  If you want the latest membership changes accounted for, you will have to restart the search.

       

Here are a couple of scenarios an administrator might encounter:

You have 33,000 users that need to have In-Place Hold instituted.  In this example you would need to create 4 distribution groups (three of 10,000 members and one of 3,000 members).  You need not worry about the size of the discovery mailbox, because it is not a required parameter for placing an In-Place Hold.

       

You need to conduct a Discovery Search against 67,000 mailboxes.  This would mean creating a minimum of 7 distribution groups.  However, searches against thousands of mailboxes might run into the limit on the size of a Discovery Search mailbox (presently 50 GB).  Avoiding that limit might cause you to go with 500 mailboxes per Discovery Search, which means creating a lot more than 7 groups.  It also means creating additional Discovery Mailboxes so no one mailbox is overwhelmed with content.
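The arithmetic behind both scenarios is just ceiling division; here is a small illustrative Python sketch (my own illustration of the batching math, not part of the Exchange tooling), using the numbers from the examples above:

```python
import math

def batches_needed(total_mailboxes, per_batch):
    """How many distribution groups (or searches) a mailbox population needs
    at a given batch size."""
    return math.ceil(total_mailboxes / per_batch)

print(batches_needed(33000, 10000))  # 4 groups for the in-place hold scenario
print(batches_needed(67000, 10000))  # 7 groups at the 10,000-mailbox maximum
print(batches_needed(67000, 500))    # 134 searches at 500 mailboxes per search
```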

       

To handle both of these hypothetical examples, I have written a script that can create the necessary Discovery Searches and spread them over the list of Discovery Mailboxes that are supplied to the script via a TXT file:

      #

      #  Objective:

      #  Create a new Discovery Search for every $mailboxesPerSearch users.  The search

      # results are distributed across the Discovery Mailboxes specified in

      # c:\o365\DiscoveryMbxs.txt.  The distribution is round robin.  The TXT file MUST

      # contain the UPN of each Discovery Mailbox to be used and each UPN should be on its

      # own line of the file.  No field name or column heading should be used.  THERE MUST

      # BE AT LEAST ONE UPN IN c:\o365\DiscoveryMbxs.txt

      #  Each search name will be text followed by a sequential number.  For example: 

      # Search1, Search2, etc.

      #

      #  The script does not start any of the Discovery Searches it creates.

      #

      #  For Demonstration purposes this script places 3 mailboxes in each search.  You can

      # change this by altering the value of the $mailboxesPerSearch variable.

      #

      #   Load the list of discovery mailboxes into a variable.  Searches will be distributed

      #  across the Discovery mailboxes in round robin fashion.

      [array]$DiscoveryList=get-content"c:\o365\DiscoveryMbxs.txt"  

      # Must use [array] to handle the case where the text file lists only one mailbox. 

      #Without Array it gets treated as a string and fails later.

      $DiscoveryCount=$DiscoveryList.count

      $DiscoveryLoop=0

       

      #   Set the number of mailboxes that will be in each search.  Acceptable range is 1 to

      # 10000.  3 is selected for illustration purposes in a lab.

      $mailboxesPerSearch= 3                       

       

      $base_search_name="Legal-TESTsearch"

      $loop_count=0

       

      # The line below obtains the list of mailboxes to be searched.

      $mbxs=get-mailbox-resultsizeunlimited-RecipientTypeDetailsUserMailbox       

              # For additional filtering options see the help for get-mailbox

              # http://technet.microsoft.com/en-us/library/bb123685(v=exchg.150).aspx and for

              # where-object http://technet.microsoft.com/en-us/library/ee177028.aspx

       

      $MbxRemaining=$mbxs.count

      $low_bound=0

       

      while ($MbxRemaining-gt0 ) {

          #   This loop creates one search per loop and decrements $MbxRemaining by the

          # number of mailboxes included in the search.

       

    $loop_count += 1

         

    $this_search = $base_search_name + $loop_count

    Write-Host "creating search: " $this_search -ForegroundColor Cyan

       

    if ($MbxRemaining -gt $mailboxesPerSearch) {

            #  There are at least $mailboxesPerSearch mailboxes left. 

            # This IF branch fills $search_list with $mailboxesPerSearch mailboxes.

       

            # Clear $search_list

      [array]$search_list = $NULL

            # Place mailboxes from the current value of low_bound to low_bound plus $mailboxesPerSearch

      # minus one in $search_list

      $mbxs[$low_bound..($mailboxesPerSearch * $loop_count - 1)] | foreach {

         [array]$search_list = $search_list + [string]$_.UserPrincipalName

            }

           } else {

              # This branch fills $search_list with the mailboxes that are left when

              # there are less than $mailboxesPerSearch remaining.

       

              # Clear $search_list

        [array]$search_list = $NULL

        # Place mailboxes from the current value of low_bound to the end of the array in $search_list

        $mbxs[$low_bound..($mbxs.count - 1)] | foreach {

           [array]$search_list = $search_list + [string]$_.UserPrincipalName

            }

          }

       

    # To create an In-Place Hold, customize the line below to suit your needs

          #New-MailboxSearch -name $this_search -SourceMailboxes $search_list -InPlaceHoldEnabled $true -ItemHoldPeriod 730

       

    # To create an ordinary Discovery Search, customize the line below to meet your needs

    New-MailboxSearch -Name $this_search -SourceMailboxes $search_list -TargetMailbox $DiscoveryList[$DiscoveryLoop] -StartDate "12/31/2012" -SearchQuery "'10-K' OR '10K' OR '10k' OR 'annual report' OR 'fire' NEAR(15) ('insur*' OR 'pay*' OR 'claim*') OR 'fire' NEAR(20) 'insurance'"

       

    $DiscoveryLoop += 1

    if ($DiscoveryLoop -ge $DiscoveryCount) { 

            #   If we have looped through all available Discovery Mailboxes reset the counter 

            # to the first element of the array.

      $DiscoveryLoop = 0

          }

    $low_bound = $mailboxesPerSearch * $loop_count

    $MbxRemaining -= $mailboxesPerSearch

      }

       

If you are using the script above to place mailboxes on In-Place Hold, I suggest selecting a name that will cause the search to appear at the bottom of the list of searches.  The reason is that a search placing mailboxes on In-Place Hold might be around for a long time.  Prefixing the name with “z” or another character further down in the sort order will place all the In-Place Hold searches at the bottom of the list when you sort by name.

       

      Some Discovery Search troubleshooting tips:

      • Include fewer mailboxes in each search (modify the $mailboxesPerSearch variable in the script)
      • Tighten the criteria applied to the get-mailbox cmdlet the script runs to make sure you are only including the mailboxes that truly need to be searched
      • Cover a smaller date range in each search.  If you searched without a date range and encountered a timeout or other failure read the error message carefully for clues.  There have been instances in the past where specifying a start and end date allowed failing queries to complete
      • Search during times when demands placed on the search servers are likely to be lower (this will vary by the region in which your mailboxes are hosted)
      • Test skipping the "Include unsearchable items" and "enable de-duplication" options.  This will reduce the demands of your search and allow it to run faster.  If this test works try breaking the search into smaller units and run them individually with the required options.  If it still doesn't work you may need to discuss the search with Office 365 support.
      • If you are exporting to PST make sure you have .Net 4.5 and its latest updates installed.
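As a sketch of the date-range tip above, one large search can be broken into smaller windows with the -StartDate and -EndDate parameters of New-MailboxSearch. The search names, dates, and query below are hypothetical, and the snippet assumes $search_list and $DiscoveryList are already populated as in the script above:

```powershell
# Hypothetical example: split one large search into per-quarter searches.
# Smaller date windows reduce the amount of data each query must cover.
$quarters = @(
    @{ Name = "Legal-2013-Q1"; Start = "1/1/2013"; End = "3/31/2013" },
    @{ Name = "Legal-2013-Q2"; Start = "4/1/2013"; End = "6/30/2013" }
)
foreach ($q in $quarters) {
    New-MailboxSearch -Name $q.Name -SourceMailboxes $search_list `
        -TargetMailbox $DiscoveryList[0] `
        -StartDate $q.Start -EndDate $q.End `
        -SearchQuery "'annual report'"
}
```

If one quarter still fails, it can be subdivided further; the per-window results can then be reviewed or exported individually.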

       

Above I mentioned scheduling your searches as a possible option.  If you want or need to run your search during a specific time window, it is best to create a scheduled task that executes your PS1 file.  The PS1 file then logs on to the Office 365 environment and executes your commands.  Here is a sample script that logs on and starts a pair of searches:

      # Specify username and password.  The Password is stored in an encrypted text file in this example

$EXOAdmUser = "YourAccount@YourDomain.onmicrosoft.com" 

      $pwd= Get-Content "c:\o365\password.txt" | ConvertTo-SecureString

      $O365cred = New-Object System.Management.Automation.PSCredential $EXOAdmUser, $pwd

       

      # Connect to Office 365

      $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $O365Cred -Authentication Basic -AllowRedirection

      Import-PSSession $session 

      Start-MailboxSearch -identity "DiscoverySearch1" -force

Start-MailboxSearch -identity "DiscoverySearch2" -force

      Exit

       

       

      To create password.txt follow these steps:

• Log in to Office 365 from a PowerShell session using the account that you will be using to create and execute Discovery Searches.  You can use these lines to do so:
        $O365Cred = (Get-Credential)
        $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $O365Cred -Authentication Basic -AllowRedirection
        Import-PSSession $session

      • Run this command:  $o365cred.Password | ConvertFrom-SecureString | Set-Content c:\o365\password.txt

      • Make sure password.txt is created.

       

To log in at the time you wish and start your script, you need to create a Scheduled Task.  In Scheduled Tasks you would use a command line like powershell.exe -Noninteractive -Noprofile -Command "&{<full path to your script>}".
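The task can also be registered from PowerShell itself. This is a minimal sketch assuming Windows 8/Server 2012 or later, where the ScheduledTasks module is available; the task name, 2:00 AM trigger, and script path are hypothetical:

```powershell
# Hypothetical example: register a nightly task that runs the search script.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NonInteractive -NoProfile -Command "&{C:\o365\StartSearches.ps1}"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "O365DiscoverySearches" `
    -Action $action -Trigger $trigger
```

Make sure the task runs as the same Windows account that created password.txt, since ConvertTo-SecureString ties the encrypted password to that user profile by default.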

       

       

       

       

       

Community Win - TechNet Wiki Day WINNERS for July 2014



In July, many articles were created and a number of them were evaluated for the TechNet Wiki Day award.

This is a great source of pride for all of us, because our Community keeps growing!

Beyond the increase in the number of articles created, we are placing ever more value on articles that respect and follow the standards described in the article Wiki: User Experience Guidelines, such as:

• Splitting the content into sections;
• Adding a "TOC" (Table of Contents) to long articles;
• Including external "References" (blogs, KBs, or official Microsoft documents);
• Including a "See Also" section for TNWiki articles on topics related to your article;
• Giving proper credit when someone influenced you to produce your content, among others;

This is guidance the Community Council encourages for all of the Community's international awards, including TechNet Wiki Day.

This special care in organizing content, combined with a good solution for a Microsoft product, determined this month's winners.

The vote held by the Article Selection Committee was very close, and small details separated these articles from our Community's top award. 

All 5 short-listed articles deserve the award for the help their content provides to our Community. 

We are pleased to present this month's winning articles below:

 Gold: Configurando Microsoft Azure Site Recovery by Fernando Lugão Veltem
 Silver: Chat Persistente - Criando Categorias e salas no Lync Server 2013 by Fabio Souza
 Bronze: Definir "Senha Nunca Expira" no Office 365 by Vinicius Mozart

The following articles were also recognized among the Community's best this July:

Lync Edge - Federação não funciona corretamente - Unable to resolve DNS SRV record Error ID 404 e 504 by Carlos F. da Silva
Expandindo disco VHD - Hyper-V by Thiago Guirotto

Congratulations to all the winning articles and their authors.

The Gold-medal article, Configurando Microsoft Azure Site Recovery, will also be featured on the international Wiki Ninjas Blog and will be recognized with the TechNet Wiki Day WINNER tag.

 

Congratulations, Fernando Lugão Veltem, and thank you for contributing to our Microsoft TechNet Wiki Community.

Until next time,

Wiki Ninja Durval Ramos ( Blog, Twitter, Wiki, Profile )

      New tools and services available for educators, now!


We are very excited to announce that we have now added phonics support to our Chekhov cloud services and tools, allowing early-grade teachers (and teachers of English as a foreign language) to create free eBook resources that read themselves aloud to learners on any Windows or Windows Phone device.  Anyone, anywhere can access these Azure-based services and create a new eBook in a matter of minutes. Every book they create can be published as a Windows 8 App, a Windows Phone App, and as a free print-on-demand PowerPoint file. Educators (and students) can add their text, illustrations/photos, and recorded audio, which will now include narrated audio for the phonemes that construct each word, something of huge educational value to beginning readers.


• Download a sample App from Diana Sharp here and encourage your fellow educators to access the free Chekhov tools at the www.lit4life.net website.
• For a short video tutorial showing how to use the Chekhov tools to create your own eBooks, go here.
• You can also create an eBook directly from your Windows Phone! Simply download the Chekhov Story Author App here.

Community event - SQL Saturday Barcelona 2014


On 25 October 2014 in Barcelona, the local group of PASS (Professional Association for SQL Server) members is organizing the first Spanish edition of SQLSaturday, a completely free event aimed at professionals and future professionals who work with, or are interested in, SQL Server, Big Data, and Business Intelligence.

The event will feature more than 20 sessions in English or Spanish covering, among others, the following topics:

• Relational engine (optimization, locking, indexes...)
• High availability
• SQL Server 2014
• Big Data (HDInsight, Pig, Hive...)
• Power BI
• Integration Services, Analysis Services, Reporting
• PowerShell
• And many more

The speaker lineup is made up of MVPs, MCMs, Microsoft employees, and renowned technical specialists. If you would like more detail on the submitted sessions, they are available for review until the final agenda is published. There are currently 127 proposed sessions and more than 180 registrants.

If you are attending TechEd Europe, held that same week in Barcelona, this is a unique opportunity to enjoy the Microsoft Corp speakers who will be there. Not only that, you could also win one of the prizes available to attendees, such as tickets to TechEd Europe Barcelona, BEATS headphones, or one of the technical books from the Wrox publishing house that will be given away.

Register
here!

Limited seats
When: Saturday, 25 Oct 2014
9:00 AM - 7:00 PM
Where: IQS, Via Augusta 390
Barcelona
