
Support Tip: Switch O365 Portal Views When Setting Up APNs Certificate


Customer experience is important to us, so we want to make you aware of an issue with setting up the APNs certificate on the new O365 Admin Portal, along with a workaround for it.

The link for setting up an APNs certificate for iOS devices on the new O365 Admin Portal brings users back to the same page instead of guiding them through the setup process. The screenshot below shows the setup link (underlined in red) that has the issue.

[Screenshot: the APNs setup link on the new O365 Admin Portal]

To work around this problem, switch to the old Office 365 Admin Portal; the link to set up an APNs certificate works there as expected. The screenshot below shows the link (underlined in red) to switch to the old portal.

[Screenshot: the link to switch back to the old Office 365 Admin Portal]

We are working to resolve this problem and will update this post once the issue is resolved.

Hitaskhi Nanavaty

Intune Customer Experience Team


(SQL) Tip of the Day: Import AdventureWorks to SQL Azure


Today’s Tip…

Steps to import the AdventureWorks database to SQL Azure:

The new portal has a sample database to import, but it appears to include only a few tables. Steps to use the sample database from the portal are here – https://blogs.msdn.microsoft.com/kaevans/2015/03/06/adventure-works-for-azure-sql-database/

The method currently outlined at the CodePlex link does not work as-is. Below are the modifications needed to make it work.

AdventureWorks2012 database for SQL Azure:

1. Download the AdventureWorks2012 database for SQL Azure from http://msftdbprodsamples.codeplex.com/releases/view/37304 to your local machine.

2. Go to the folder where AdventureWorks was downloaded.

[Screenshot: the downloaded AdventureWorks files]

3. Open SSMS 2016, right-click the server node, and execute CreateAdventureWorksForSQLAzure_DB.

4. Once the database is created, right-click the new database and execute the script AdventureWorks2012ForSQLAzure_Schema shown in the screenshot.

5. Once the schema is created, go to the directory where CreateAdventureWorksForSqlAzure.cmd is located, open the cmd file in Notepad, comment out the first few lines with "#" as shown below, and save it.

[Screenshot: CreateAdventureWorksForSqlAzure.cmd with the first few lines commented out]

6. Populate the data by opening a command prompt (Run as administrator), changing to the directory where the cmd file is located, and running it like below:

CreateAdventureWorksForSqlAzure.cmd "servername" "sarita@servername" "pwd"

(Replace these with your own server name, login, and password.)

7. Once the data is copied, check the tables in SSMS.
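For a quick check without the SSMS UI, here is a minimal sketch that counts the imported tables from PowerShell. It assumes the sqlcmd utility is installed, and the server, database, login, and password values are placeholders to replace with your own.

# Sketch: verify the import by counting user tables in the new database.
# All connection values below are placeholders.
$server   = "yourserver.database.windows.net"
$database = "AdventureWorks2012"
sqlcmd -S $server -d $database -U "sarita@yourserver" -P "pwd" `
    -Q "SELECT COUNT(*) AS TableCount FROM sys.tables;"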

Exchange Blog Roundup for June 2016


Here is a roundup of the Exchange blog posts from June 2016. The list lets you review any posts you may have missed, so please take this opportunity to have a look.

• June 6: Issue with OWA hyperlinks caused by applying Exchange 2010 SP3 RU13 (Exchange Server Support)
• June 28: Monitoring Exchange Server 2016 with System Center Operations Manager (Japan Office Official Blog)
• June 28: Quarterly Exchange updates: June 2016 updates released (Japan Office Official Blog)
• June 30: What to do when transaction logs are not deleted and disk space runs low (Exchange Server Support)

About the 0x46 Error When Running Restore-DatabaseAvailabilityGroup


Many customers configure a DAG across different Active Directory sites for disaster recovery purposes.
Suppose a disaster strikes the primary site and you need to switch the databases over to the secondary site and run from there.
When DAC mode (datacenter activation coordination mode) is configured, you can mark the mailbox servers in the primary site as stopped with the Stop-DatabaseAvailabilityGroup cmdlet, then use the Restore-DatabaseAvailabilityGroup cmdlet to evict the stopped mailbox servers from the cluster and bring the databases into service in the secondary site.

In this post we introduce an error that can occur when running the Restore-DatabaseAvailabilityGroup cmdlet.
When you run Restore-DatabaseAvailabilityGroup, a ForceQuorum is performed internally and the cluster service is forcibly restarted.
The mailbox servers that were marked as stopped by Stop-DatabaseAvailabilityGroup are then evicted, the quorum is reconfigured, and the DAG is brought up in the secondary site.

However, some of the operations performed by this cmdlet run independently and may not wait for other operations to finish. In other words, the eviction of the next node may start while the cluster service on a secondary-site mailbox server is still starting up; in such a case, Restore-DatabaseAvailabilityGroup fails with a 0x46 error like the following.

+++++++++++
An error occurred while attempting a cluster operation. Error: Cluster API failed: "GetClusterKey() failed with 0x46. Error: The remote server has been paused or is in the process of being started."
+++++++++++

Error code 0x46 is ERROR_SHARING_PAUSED, which indicates that the operation failed because the service is stopped or still in the process of starting.
Past support cases have reported this error on both Exchange 2010 and Exchange 2013.
Because it depends on the startup state of the cluster service, it does not always occur; if it does occur, you can recover simply by running the Restore-DatabaseAvailabilityGroup cmdlet again.
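Put together, a DAC-mode datacenter switchover with a simple retry for the 0x46 error might look like the following sketch (the DAG name and Active Directory site names are placeholders):

# Sketch: mark the primary-site servers as stopped, then activate the
# secondary site, retrying because the 0x46 (ERROR_SHARING_PAUSED) failure
# is timing-dependent and a re-run normally succeeds.
Stop-DatabaseAvailabilityGroup -Identity "DAG1" -ActiveDirectorySite "PrimarySite" -ConfigurationOnly
for ($i = 1; $i -le 3; $i++) {
    try {
        Restore-DatabaseAvailabilityGroup -Identity "DAG1" -ActiveDirectorySite "SecondarySite" -ErrorAction Stop
        break  # switchover succeeded
    } catch {
        Write-Warning "Attempt ${i} failed: $($_.Exception.Message)"
        Start-Sleep -Seconds 60  # give the cluster service time to finish starting
    }
}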
 
Similarly, running the Start-DatabaseAvailabilityGroup cmdlet can also fail with a 0x13af error, depending on the timing of the cmdlet's internal operations.
This issue is described in the KB article below; the remedy is the same as for Restore-DatabaseAvailabilityGroup: run the Start-DatabaseAvailabilityGroup cmdlet again.

 Title: Reactivating a datacenter with the Start-DatabaseAvailabilityGroup cmdlet
 URL: https://support.microsoft.com/ja-jp/kb/2696240

In addition, the following post covers a case where databases cannot be mounted during a cross-site switchover, which may also be worth reviewing.

 Title: About DAGs in Exchange 2010 DAC mode
 URL: https://blogs.technet.microsoft.com/exchangeteamjp/2012/07/02/exchange-2010-dac-dag/

Thank you for your continued readership of this blog.
 

* The information in this post (including attachments and links) is current as of the date of writing and is subject to change without notice.

May Updates on Office 365 Security and Compliance


(This article is a translation of May Office 365 security and compliance update, posted to Office Blogs on May 11, 2016. For the latest information, please refer to the original article.)

 

Over the past month, the Office 365 team has continued to roll out new security features and has earned national certifications demonstrating that it maintains the highest standards of cloud security.

This post summarizes the major news from the past few weeks.

Office 365 email safety tips: Today's spam and malware attacks are so sophisticated that users can mistake them for legitimate mail. Simply routing messages to the Junk Email folder is not enough. Over the coming weeks we are therefore rolling out Safety Tips in Exchange Online Protection, a feature that displays a warning on messages marked as suspicious and a notice when a message is safe.

 

Office 365 Import Service expands regions and adds new features: The Office 365 Import Service is now generally available, with expanded regional availability and new features that make it even easier to import data into Office 365.

 

Automating time-consuming eDiscovery search tasks: When performing eDiscovery or an investigation, it is important to be able to quickly create exploratory searches and reports that validate the richness and quality of the underlying data and searches. To help with this, the Office 365 Security and Compliance Center provides Windows PowerShell cmdlets that automate time-consuming content search tasks.
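As a rough sketch of what that automation can look like from Security & Compliance Center PowerShell (the search name and query below are placeholders, not taken from the original article):

# Sketch: create a content search, start it, then check its status.
New-ComplianceSearch -Name "Exploration-01" -ExchangeLocation All -ContentMatchQuery 'subject:"quarterly report"'
Start-ComplianceSearch -Identity "Exploration-01"
Get-ComplianceSearch -Identity "Exploration-01" | Format-List Name, Status, Items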

 

Yammer meets Office 365 advanced compliance standards: We recently announced that Yammer complies with industry-leading security and compliance standards, including ISO 27001 and SSAE 16. Compliance reports for each standard are now published in the Office 365 Service Trust Portal, making it easy for customers to perform their own regulatory risk assessments. For details on the industry standards and regulations Office 365 supports, see the Office 365 compliance framework (in English).

 

How Microsoft monitors and protects sensitive data in Office 365 (in English): Microsoft IT recently published a case study explaining how it uses the Office 365 data loss prevention solution to promote collaboration while reducing the risk of sensitive data being shared.

 

Office 365 achieves the Cloud Security Gold Mark (in English): The Cloud Security Mark (CS Mark) is Japan's first security standard for cloud service providers (CSPs), based on ISO/IEC 27017, the international code of practice for information security controls. With the CS Mark, customers can verify the operational transparency and visibility of Office 365's information security measures and address common concerns about data security and confidentiality. CS Mark certification is carried out by the Japan Security Audit Association (JASA). The information security audit scheme established by JASA (AISAS) prescribes auditing of roughly 1,500 security controls across areas such as information, physical, and development security organization; personnel security; and business continuity, disaster recovery, and incident management. Following a rigorous review by a JASA-accredited audit body, Office 365 became the first cloud service provider to obtain the CS Gold Mark.

 

Office 365 and Azure are the first hyperscale cloud service providers to comply with Spain's national security framework (in English): This framework specifies the key policies and mandatory requirements that government agencies must address and demand of the service providers they use, with specific security measures defined for each security category: availability, reliability, integrity, confidentiality, and traceability. Following a rigorous review by the independent auditor BDO, Microsoft Azure and Microsoft Office 365 received official reports certifying their compliance with the framework. According to BDO's report, the security measures, information systems, and data processing facilities of Azure and Office 365 comply at a high level with Royal Decree 3/2010 (the basis of Spain's national security framework), with no corrective measures required.

 

Moving to the cloud strengthens security

As many customers have already experienced, the cloud has the inherent advantage of keeping pace with a constantly changing security landscape, along with many characteristics fundamentally different from managing IT services in-house. While this is not limited to security, it certainly carries important implications and benefits on the security front as well.

For an overview of cloud security, watch the following video.

 

* The information in this post (including attachments and links) is current as of the date of writing and is subject to change without notice.

Enabling remote storage access in Remote Desktop Connection


By Michael Sammels

This article came about recently when I was connecting to my desktop via Remote Desktop Connection (RDC). I had an audio CD in the drive of the desktop and was a bit confused when I discovered that I was unable to fully access it. I could open the disc and browse the file list, but I wasn't able to open any files. I discovered this also applied to other devices, such as SD cards and flash drives. For those who would like or need to access files on a storage device, there is a way to enable this over Remote Desktop Connection. You need to edit the option via Group Policy on the host machine, but you can do this over the RDC session, and the change is instant.

Step-by-step

The first step is to load up the Group Policy Editor. Please note: you need to be using a Pro edition of Windows or above. The fastest way to open the Group Policy Editor is to press Win Key + R to open the Run menu and type in gpedit.msc:

[Screenshot: the Run dialog with gpedit.msc entered]

This will open up the Group Policy Editor. If you are on a domain, or you are not an Administrator on the machine you are trying to edit, you will (most likely) run into the following problem:

[Screenshot: Group Policy Editor permissions error]

Unfortunately, this will mean you will not be able to continue with this guide. If you are able to get past all of these pesky little errors, you should be presented with the standard Group Policy Screen, which looks like so:

[Screenshot: the standard Group Policy Editor screen]

The above screenshot is from a machine running Windows 8.1 – as I do not have permission to load up Group Policy Editor on my current machine. Don’t worry though – the steps are exactly the same. For the remainder of this article, I will be unable to provide screenshots, so I will try to make everything as clear as possible. If you need some more clarification, please feel free to comment.

If you have used the Registry Editor in the past, you will notice this is laid out in a similar manner. There are hierarchical folders on the left, with options on the right. Navigate as so:

Computer Configuration -> Administrative Templates -> System -> Removable Storage Access

Click on Removable Storage Access and you will be presented with a plethora of options. You are looking to find (in the right hand pane) and enable the following policy:

All Removable Storage: Allow direct access in remote sessions

Enabling this will allow you to access USB flash drives, CDs, and whatever else takes your fancy. To enable it, simply double-click it, select the "Enabled" radio button, and then go through the standard steps of Apply and OK.

If you enable this policy setting, remote users will be able to open direct handles to removable storage devices in remote sessions.

As stated earlier, this change is instantaneous, does not require a restart and can be done while remotely connected to the PC – which makes it very handy to turn on and off.
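If you would rather script the change (for example, on machines where opening the Group Policy Editor UI is inconvenient), the policy can also be set through its registry-based policy key. Treat the value name below (AllowRemoteDASD) as an assumption to verify on your own build rather than a documented guarantee:

# Sketch: enable "All Removable Storage: Allow direct access in remote
# sessions" via the policy registry key. Run from an elevated PowerShell
# prompt on the host machine. The AllowRemoteDASD value name is an
# assumption - verify it on your Windows build before relying on it.
$key = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\RemovableStorageDevices"
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name "AllowRemoteDASD" -Value 1 -Type DWord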

 

Resources

How to establish remote desktop connection – howtoconnect.com

Counting down to #WPC16! Are you prepared?


We're incredibly excited to welcome you and the world back to Toronto, Canada, for this year's Worldwide Partner Conference! On behalf of the entire Microsoft Canada team, we hope next week proves to be a great opportunity to reconnect with old colleagues and friends, while also making new connections to help you grow your business.

Before the event, be sure to:

Download the #WPC16 mobile app

The WPC 2016 mobile app integrates with your registration info and WPC Connect profile to bring the WPC experience to your mobile device. Build your personalized schedule, access conference maps, submit session evaluations, and keep the conference agenda in your pocket at all times.

Confirm your onsite meeting times

Connect with your colleagues before the conference, and make sure that everyone in your meetings has the same time and location on their schedules. If you’ve scheduled meetings through the WPC Connect Meeting Scheduler, make sure your attendees have accepted. Note that scheduled meetings that do not have accepted attendees are automatically canceled to free up tables for others to use.

Download the Conference Guide and the Know Before You Go document

The Canadian Conference Guide and the Know Before You Go guide are excellent resources for conference information, including transportation details, restaurant and dining options, Wi-Fi passwords, and building maps.

Get to know Toronto

The city is full of excellent options for evening exploration, refreshments, and relaxation. Visit the WPC Logistics page for details on deals and discounts. You can also learn more about what to do in Toronto from our "Is it your First Time in Toronto" or "Top 10 things to do in Toronto" blog posts.

View this interactive Bing map of the important WPC locations in Toronto and local restaurant options.


Once you’re with us in Toronto, you’ll want to:

Enjoy the Canadian Events

Canadian Keynote & Welcome Reception:
July 10th – 4:30pm to 7:30pm, at the Roy Thomson Hall

IMPACT Awards Dinner*:
July 10th – 7:30pm to 9:30pm

IAMCP Canadian Meet-Up:
July 11th – 9:00pm, at Marben

Canadian Regional Party:
July 12th – 9:00pm to 12:00am, at the Royal Ontario Museum (ROM)

*By invitation only.

Visit the Canadian Lounge

Make sure you’re getting the most out of your WPC meetings with other Partners. The Canada lounge is a great place to meet other partners, relax between sessions, and network!

Lounge hours:
Monday 11:00 AM – 6:00 PM
Tuesday 11:00 AM – 6:00 PM
Wednesday 11:00 AM – 6:00 PM
Thursday 8:00 AM – 1:00 PM

Lounge location:
The Commons, MTCC (in the Regional Lounges area)

Registration and Badge Pick-up

Please check in at any of the locations below upon your arrival at the airport, or at the registration desks at the Metro Toronto Convention Centre (MTCC). For more information, check the conference guide.

Toronto Pearson International Airport – Arrivals, Terminal 1 and 3:
Saturday, July 9 and Sunday, July 10th – 8:00AM to 8:00PM

Metro Toronto Convention Centre – Level 600, South Building:
Sunday, July 10: 8:00AM to 8:00PM
Monday, July 11: 7:00AM to 6:00PM
Tuesday, July 12: 7:30AM to 6:00PM
Wednesday, July 13: 7:30AM to 5:00PM
Thursday, July 14: 8:00AM to 11:00AM

Visit the Community Hub

Head to the Community Hub to engage with a range of partner communities, such as the International Association of Microsoft Channel Partners (IAMCP) and Women in Technology (WIT) communities, and find new ways to get involved in the Microsoft partner ecosystem.

Join the Conversation

Follow us on Twitter to get real-time updates – and don’t forget to add your voice by tweeting about your experience using #WPC16. If this is your first WPC, then make sure you hashtag #New2WPC! If you see the social team at any of the events, please feel free to say Hi! See you soon!

Figuring out which client was used by the sender to send an email


I have had several customer requests in the past from admins who wanted to figure out which client was used to send an email from a specific mailbox in their Exchange environment (e.g., whether OWA, Outlook, ActiveSync, EWS, etc. was used to send the email).

The simplest and easiest way is to look through your message tracking logs.

In Exchange 2010, we introduced a field in the message tracking logs called original-client-ip.

This captures the IP of the machine that was used to send the email. It does not show up in your Get-MessageTrackingLog cmdlet output by default on Exchange 2010.

So, you will have to manually open the message tracking log file (based on the time the email was sent) on the mailbox server and look for the message ID of the email being sent. The message ID can be found both from Get-MessageTrackingLog and from the message header in the recipient's mailbox.

[Screenshot: Get-MessageTrackingLog output for the Submit event]

The screenshot above shows an email sent by a user with Outlook. When you run the Get-MessageTrackingLog cmdlet for the Submit event, the SourceContext field shows that the client used was MOMT.
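To reproduce this yourself, a minimal sketch looks like the following (the server name and message ID are placeholders):

# Sketch: find the Submit event for a given message and inspect the
# SourceContext field, which names the client type (e.g. MOMT for Outlook).
Get-MessageTrackingLog -Server "MBX01" -EventId "Submit" -MessageId "<message-id-from-header@contoso.com>" | Format-List Sender, EventId, SourceContext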

ClientType:MOMT

Here are the various client types and their meanings.

[Table: client types and their definitions]

MOMT – expands to Messaging On Middle Tier (meaning the Outlook client).

If you use OWA to send an email, you will notice the ClientType as OWA.

Now that we know what type of client was used, how do we find the IP address from which the email was submitted?

Well, in Exchange 2010 there is no simple way like Get-MessageTrackingLog; you will have to search through the raw message tracking logs on the mailbox server where the user's mailbox database is mounted, to find out which transport server the email was handed to by the mail submission component of the mailbox server.

If you have more than one multi-role server in the same AD site, the submission service will prefer to submit the email to a transport service other than its own. Only when it cannot submit the email to other servers will it submit to the transport service locally on the same server.

So, go to the Exchange install directory (you can get that by looking at the built-in variable $exinstall from your Exchange Management Shell).

[Screenshot: the Exchange install directory in Explorer]

Under V14, go to TransportRoles\Logs\MessageTracking.

You will notice a large number of Tracking log files under MessageTracking folder.

[Screenshot: raw tracking log files in the MessageTracking folder]

(Since the above screenshot is from my lab, you do not see a large number of files here.)

Look for the file name MSGTRKM********-*.log around the time that the message was sent.

Note – The file name is MSGTRKM and not MSGTRK.

MSGTRKM files are message tracking logs generated by the submission component on the mailbox server, while MSGTRK logs are the ones that contain further information about what happened to the email in the transport pipeline.

You can copy the MSGTRKM log files to a machine with Excel and filter them for the MessageID that you obtained from the Get-MessageTrackingLog command above.

You will find the Hub Transport server the message was handed to under the Server-Host-Name header. Once you have that, open the message tracking logs generated around the time of this email submission on that particular transport server (the one from the Server-Host-Name header), and filter them for the MessageID again.

Also filter the obtained results for Event-id "Receive".
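If you prefer PowerShell over Excel for this, the following sketch parses a raw tracking log into objects using the column names from its #Fields header line (the file name and message ID are placeholders):

# Sketch: load a raw tracking log and filter it for one message's RECEIVE
# event. Raw log header lines start with "#"; the "#Fields:" line carries
# the column names.
$logFile = "C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\MessageTracking\MSGTRK20160706-1.LOG"
$columns = ((Get-Content $logFile | Where-Object { $_ -like "#Fields:*" }) -replace "^#Fields:\s*") -split ","
Get-Content $logFile | Where-Object { $_ -notlike "#*" } |
    ConvertFrom-Csv -Header $columns |
    Where-Object { $_."message-id" -eq "<message-id@contoso.com>" -and $_."event-id" -eq "RECEIVE" }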

You should now see only one line, as in the screenshot below.

[Screenshot: the log filtered by Event-id and MessageID]

Here, if you scroll right and look at the column "original-client-ip", you will see the IP address of the client machine from which the email was submitted.

In Exchange 2013, this task is much simpler, as we introduced the OriginalClientIp attribute in the Get-MessageTrackingLog cmdlet output.
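A quick sketch of the Exchange 2013 version (again with a placeholder server name and message ID):

# Sketch: on Exchange 2013, OriginalClientIp is exposed directly in the
# cmdlet output, so no raw log digging is needed.
Get-MessageTrackingLog -Server "MBX01" -EventId "Submit" -MessageId "<message-id@contoso.com>" | Format-List Sender, SourceContext, OriginalClientIp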

[Screenshot: Exchange 2013 Get-MessageTrackingLog output with OriginalClientIp]

Now, if you are wondering why you aren't seeing the correct IP of the client machine, there can be several reasons. A few of them are listed below:

  1. You have a load balancer that masks the client IP address and stamps the connection to the server with its own floating IP address.
  2. The client is coming from the internet through a firewall/proxy server that masks the IP address of the client machine.

If a user account gets compromised and sends out a large number of spam emails, you can now quickly figure out what type of client is being used to send them and take action against such clients for that user.

For more information on the fields in Message Tracking Logs, please refer: https://technet.microsoft.com/en-us/library/bb124375(v=exchg.160).aspx


(SQL) Tip of the Day: Improving Azure DW Insert Performance


Today’s Tip…

We have had a few cases recently where users were concerned about their write performance on Azure DW. What we have found is that the backend distribution databases are designed for fast reads but can have slower inserts. TempDB on Azure DW, though, uses very write-efficient storage.

What we have found is that if you insert into a temp table first and then harden the rows to your permanent table, you will get significantly better write speeds, and hopefully a great experience with Azure DW!
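A minimal sketch of the pattern, driven from PowerShell via sqlcmd (server, database, credentials, and table names are all placeholders):

# Sketch: stage the rows in a temp table first (tempdb storage is very
# write-efficient on Azure DW), then harden them to the permanent table
# with a single INSERT...SELECT.
$query = "CREATE TABLE #staging (id INT, payload NVARCHAR(200)); " +
         "INSERT INTO #staging VALUES (1, N'row 1'); " +
         "INSERT INTO #staging VALUES (2, N'row 2'); " +
         "INSERT INTO dbo.FactTable (id, payload) SELECT id, payload FROM #staging;"
sqlcmd -S "yourdw.database.windows.net" -d "YourWarehouse" -U "loaduser" -P "pwd" -Q $query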

Certifying on Windows Server 2016



Windows Server 2016 has yet to launch, but that doesn’t stop the guys over at Microsoft Learning from getting ready to train and certify you on the new platform.

Just last month we began to see the first courses become public in preparation for MCSA: Windows Server 2016. All three courses are currently ‘In development’ with a planned publication date set for 13 September 2016.

Before we dive into the detail, first a quick disclaimer: the courses are still in development, along with Windows Server 2016 itself (currently on Technical Preview 5), so expect this information to change in the coming months.

Now that's out of the way, after scouring the pages to extract all the information possible, here are the key things you need to know:

A look at MCSA: Windows Server 2016

The MCSA pathway to Windows Server 2016 looks set to follow the same structure as its predecessor: three MOCs and three exams.

Microsoft Learning have so far published three Windows Server 2016 courses, and, while we are yet to see exams emerge, they are mentioned in the course overviews bearing the equivalent 70- prefix. So let’s take a look at the three courses:

Course 20740A: Installation, Storage, and Compute with Windows Server 2016

According to Microsoft, this course is designed for professionals managing storage and compute in Windows Server 2016, and who need to understand the requirements, scenarios, storage and compute options available and applicable to Windows Server 2016. Not to mention preparing you to sit the certification exam 70-740: Installation, Storage and Compute with Windows Server 2016.

Basically, it’s for Windows Server admins and IT pros looking to learn the critical skills and knowledge to get the most from storage and compute features available on the new platform. It also covers all installation and configuration processes for Windows Server 2016 – including a focus on Nano Server. Before you ask, yes, it looks like there’s material on migration from past Server platforms.

There's a large focus on Hyper-V and configuration of virtual machines, with a section dedicated to installing, managing and configuring the newly launched Hyper-V containers – an alternative OS and machine virtualisation offering in a lightweight configuration to reduce performance hits. There are sections on failover clustering and network load balancing, and it looks like you'll even get to grips with high availability and disaster recovery tech in Windows Server 2016.

Course 20741A: Networking with Windows Server 2016

This course is all about developing the fundamental networking skills to deploy Windows Server 2016. You'll cover the platform's in-built networking technologies as well as core networking principles, including remote access technologies, IP fundamentals and advanced features such as Software Defined Networking (SDN). The course has the secondary function of preparing you for exam 70-741: Networking with Windows Server 2016.

If you're a network, system or infrastructure admin looking to implement core and advanced networking features in Windows Server 2016, this one's for you. Looking closer, you'll learn to plan and implement IPv4 and IPv6 networks, DNS and IPAM features. You'll also learn all about remote access, DirectAccess and VPNs – allowing company-wide access to resources.

The course also covers core features allowing you to unlock the power of network virtualisation using Windows Server 2016. You'll cover Software Defined Networking and the implementation of advanced network features, including Hyper-V virtual switches and advanced Hyper-V networking features.

Course 20742A: Identity with Windows Server 2016

The third and final course is for AD DS, system or infrastructure admins. It focuses specifically on developing the skills to use the full capabilities of Active Directory in Windows Server 2016, whilst also preparing you for exam 70-742: Identity with Windows Server 2016.

You’ll look at configuring and deploying Active Directory Domain Services (AD DS), implementing Group Policy and performing back-up and restore processes. There is also a focus on additional server roles with Active Directory Federation Services (AD FS) and Active Directory Certificate Services (AD CS).

For the cloud embracing AD DS specialists out there, the course has a focus on integration with Azure. You’ll learn to implement synchronisation between Windows Server 2016 AD DS and Azure AD.

When can I sit the MCSA Windows Server certification?

In short, as soon as both the exams and courses are published. We know course publication is expected on 13th September 2016; however, without the accompanying exam pages, I'm still somewhat in the dark.

Best guess at this stage is October 2016. If you’re interested in being on the first course, whenever that may be, Firebrand are planning to be the first offering an accelerated MCSA: Windows Server 2016 course in just 9 days.

 

Resources

What’s new in Windows Server 2016 – Microsoft Virtual Academy

Course 20740A: Installation, storage and compute with Windows Server 2016

Course 20741A: Networking with Windows Server 2016

Course 20742A: Identity with Windows Server 2016

 

By Ed Jones. Ed works for Firebrand Training, a Microsoft Gold Learning Partner. He has worked in the IT training and certification industry for the past 4 years. He is a tech enthusiast with experience working with SharePoint, Windows Server and Windows desktop.

Auditing Admin Activities – reporting purposes


Many organizations that I visit have multiple administrators. When something breaks, the administrators often have no clear record of what was changed, which object was modified, who made the change, and when it was done.

This applies not only to customers on on-premises Exchange servers but also to customers already on Exchange Online.

So, I thought of writing a script to gather this information. Although it just uses our age-old (yes, since Exchange 2010 days) method of searching the admin audit logs, it presents the results in a clear format that admins can consume, covering changes made to the environment from the Exchange Admin Console or Exchange Management Shell. If they review it, there is a clear record of what was changed from the Exchange environment perspective.

Use cases:

An Exchange admin takes permissions over another mailbox, or grants a user permissions on another mailbox (Send As or full mailbox access).

An Exchange admin makes a change via the Exchange Management Shell, Exchange Management Console, or Exchange Admin Center, such as a change to a transport rule.

 


 

################################ Disclaimer ###################################

<# This software (or sample code) is not supported under any Microsoft standard support program or service.
The software is provided AS IS without warranty of any kind.
Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose.
The entire risk arising out of the use or performance of the software and documentation remains with you.
In no event shall Microsoft, its authors, or anyone else involved in the
creation, production, or delivery of the software be liable for any damages whatsoever
(including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss)
arising out of the use of or inability to use the software or documentation, even if Microsoft has been advised of the possibility of such damages. #>
# Version 1.0
############################## Start of the script #################################

$smtp = "HTServer1.contoso.com" # This should be a Hub Transport server in the environment.
# $smtp has the server name which can be used as the server to connect from the script and submit the email. This can be an Exchange server, an IIS SMTP server, or any other third-party server.
$from = "AdminAuditLogReport@domain.com" # This is just the email address that shows up in the sender field - it can be any email address.
# $from has the name that needs to appear in the From field of the email.
$users = "AuditManager@contoso.com"; # this can be any email address you would like to have.
#$users = "Auditmanageruser1@contoso.com","Auditmanageruser2@contoso.com"; # You can add any other users who would like to receive this alert email for admin audit logs.
$AdminAuditCount = 0
$Begin = $null
$Begin = Get-Date
$Start = $Begin.AddDays(-1)
$end = $Begin
$Filename = $null
$Filename = "AdminAuditLogSearch_" + $Begin.Hour + $Begin.Minute + "_" + $Begin.Day + "-" + $Begin.Month + "-" + $Begin.Year + ".htm"

$report = $null
$report += "<html>"
$report += "<head>"
$report += "<meta http-equiv='Content-Type' content='text/html; charset=iso-8859-1'>"
$report += '<title>Admin Audit Log Search Report</title>'
$report += '<STYLE TYPE="text/css">'
$report += "<!--"
$report += "td {"
$report += "font-family: Tahoma;"
$report += "font-size: 11px;"
$report += "border-top: 1px solid #999999;"
$report += "border-right: 1px solid #999999;"
$report += "border-bottom: 1px solid #999999;"
$report += "border-left: 1px solid #999999;"
$report += "padding-top: 0px;"
$report += "padding-right: 0px;"
$report += "padding-bottom: 0px;"
$report += "padding-left: 0px;"
$report += "}"
$report += "body {"
$report += "margin-left: 5px;"
$report += "margin-top: 5px;"
$report += "margin-right: 0px;"
$report += "margin-bottom: 10px;"
$report += "}"
$report += "table {"
$report += "border: thin solid #000000;"
$report += "}"
$report += "-->"
$report += "</style>"
$report += "</head>"
$report += "<body>"
$report += "<table width='100%'>"
$report += "<tr bgcolor='#CCCCCC'>"
$report += "<td colspan='7' height='25' align='center'>"
$report += "<font face='tahoma' color='#003399' size='4'><strong>These were the admin audit logs on the Exchange environment from $Start till $end </strong></font>"
$report += "</td>"
$report += "</tr>"
$report += "</table>"
$report += "<table width='100%'>"
$report += "<tr bgcolor=#CCCCCC>"
$report += "<td width='10%' align='center'>Admin who ran the command</td>"
$report += "<td width='10%' align='center'>Cmdlet run</td>"
$report += "<td width='10%' align='center'>Modified object</td>"
$report += "<td width='10%' align='center'>Succeeded</td>"
$report += "<td width='10%' align='center'>Run date</td>"
$report += "<td width='10%' align='center'>Server on which the cmdlet was run</td>"
$report += "</tr>"
$AdminAuditLog = Search-AdminAuditLog -StartDate $Start -EndDate $end | where {($_.CmdletName -notlike "New-Mailbox") -and ($_.CmdletName -notlike "Enable-Mailbox") -and ($_.CmdletName -notlike "Set-Mailbox") -and ($_.CmdletName -notlike "New-MailContact") -and ($_.CmdletName -notlike "Set-MailContact") -and ($_.CmdletName -notlike "Add-DistributionGroupMember") -and ($_.CmdletName -notlike "Remove-DistributionGroupMember") -and ($_.CmdletName -notlike "Set-DistributionGroup") -and ($_.CmdletName -notlike "New-DistributionGroup") -and ($_.CmdletName -notlike "New-MoveRequest") -and ($_.CmdletName -notlike "New-MailboxExportRequest") -and ($_.CmdletName -notlike "New-MailboxImportRequest") -and ($_.CmdletName -notlike "Suspend-MoveRequest") -and ($_.CmdletName -notlike "Remove-MoveRequest") -and ($_.CmdletName -notlike "Search-Mailbox")}
# The above line excludes a number of cmdlets from the report. The reason: the admin audit log tracks any command that modifies an object on the Exchange server (Set, Enable, Add, Remove, Move, New, etc.).
# Including all of that information is too much junk every day. Only the major changes should be monitored.
if ($AdminAuditLog -ne $null)
{
foreach ($Entry in $AdminAuditLog)
{
Write-Host $Entry.Caller `t $Entry.CmdletName `t $Entry.ObjectModified `t $Entry.Succeeded `t $Entry.RunDate `t $Entry.OriginatingServer -ForegroundColor GREEN
$caller = $Entry.Caller
$cmdlet = $Entry.CmdletName
$objmodified = $Entry.ObjectModified
$success = $Entry.Succeeded
$RunDate = $Entry.RunDate
$servername = $Entry.OriginatingServer
$report += "<tr>"
$report += "<td>$caller</td>"
$report += "<td bgcolor='Yellow' align=center>$cmdlet</td>"
$report += "<td>$objmodified</td>"
$report += "<td>$success</td>"
$report += "<td>$RunDate</td>"
$report += "<td>$servername</td>"
$report += "</tr>"
$AdminAuditCount += 1
}
}
else
{
Write-Output "No commands were run which modified any objects in AD from $start till $end."
}

$report += "</table>"
$report += "<br>"
$report += "Note: This will not include any changes made on the Edge servers or any modifications made directly in AD by logging into AD servers (e.g. using ADSIEDIT, DSA, or any other AD tools)."
$report += "</body>"
$report += "</html>"
Write-Output $AdminAuditCount
if ($AdminAuditCount -ge 1)
{
foreach ($user in $users)
{
Send-MailMessage -Body $report -BodyAsHtml -SmtpServer $smtp -From "$from" -To "$user" -Subject "Administrator Audit Logs" # send emails to all the users specified in $users
$report | Out-File -Encoding ASCII -FilePath ".\reports\AdminAuditLogs\$Filename"
# The above line saves a copy of the report to that location, so the folder structure needs to exist beforehand.
}
}

############################## End of the script #################################

 


 

The above script checks for admin activities in the environment, reporting what changes were performed by admins (except the actions excluded above), and sends an email to all the addresses listed in $users.

Admin audit logs are kept for 90 days by default; this is customizable on on-premises Exchange servers.
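For example, on-premises the retention period can be extended like this (180 days here is just an illustrative value):

# Sketch: extend admin audit log retention from the 90-day default to 180 days.
Set-AdminAuditLogConfig -AdminAuditLogAgeLimit 180.00:00:00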

Hope this helps a few of the admins out there who want to know what changes were made in their Exchange environment during a specific period.

Power BI can analyze your tweets!


My colleague Sam Lester has been working on creative ways to showcase how to use Power BI. His latest post talks you through analyzing your tweet history with Power BI:

https://blogs.msdn.microsoft.com/samlester/2016/07/06/create-your-personalized-twitter-analytics-dashboard-in-power-bi-in-10-minutes/

It's a nice way to answer some questions you might not have thought to ask – like which day you most often tweet. I don't do a lot on Twitter, but even with a limited data set it is interesting to see that I take Wednesday "off" from tweeting, don't do much on Thursdays, but tweet a lot more on Friday and Saturday.

Sam's also done a nice job of letting you download the template and walking you through getting your own data and importing it. Just a couple of clicks and you should be able to make your own dashboard!

 

 

[Screenshot: Twitter analytics dashboard in Power BI]

New to Office 365 in June: Microsoft Planner launches, inking comes to Android, and new Office 365 security features


The Office apps on Windows, Mac, iOS and Android received many updates in June. We reached new milestones with Planner and GigJam, introduced advanced security management for Office 365 and new datacenter regions, and released the next cumulative feature update for Office 2016.

 

Office inking arrives on more devices and apps

In January, we added inking to Word, Excel, PowerPoint and OneNote on Windows desktops and the iPad, letting you use your finger, pen or stylus to write and draw in a more intuitive and powerful way. This month, for Office Insiders, we extended inking on Android devices beyond OneNote to Word, Excel and PowerPoint. Inking is now also available in Word, Excel and PowerPoint on Windows Phone, and the iPhone will receive this update next month.

Inking is now available in Office on Android tablets and phones.

 

We are also extending shape recognition to Excel on Windows PCs and the iPad, and it will arrive on the iPhone along with next month's inking update. Shape recognition converts rough, hand-drawn shapes into clean, polished ones. In Excel, it makes it easy to create attractive dashboards, build custom button links and more. Shape recognition will also extend to PowerPoint on the desktop and iPad, coming to the iPhone next month as well. Finally, we will soon bring shape recognition to Word.

Shape recognition in Excel makes it easy to create custom dashboard styles!

 

On the desktop side, we are bringing annotated drawing and real-time inking in Visio to subscribers on Windows PCs, so you can easily add annotations and sketch your responses on complex diagrams and flows. We have also updated OneNote in response to requests from Mac users: trackpad-based inking now supports segmented strokes, along with support for third-party styluses and pen-enabled tablets and displays.

Inking is rolling out in OneNote on the Mac.

 

New Sway features for Office 365 subscribers

Sway is an app for digital storytelling that makes it easy to create and share visually compelling presentations, newsletters, personal blogs and more. Since Sway launched last August, we have seen professionals, students and bloggers create thousands of creative and impressive Sways. Now, in addition to the free version of Sway, Office 365 customers with commercial and education subscriptions can use the following three new features to create more powerful, professional Sways and control sharing more precisely.

  • Password-protect your Sways:

Control who can view your Sways with password protection. For sharing among users in companies and schools, everyone with an Office 365 account gets the extra layer of security that password protection provides.

  • Increased content limits:

Create longer, richer Sways that can hold more images, videos, charts and more. This helps in scenarios that call for longer content, such as employee training, student projects and travel reports.

  • Remove the Sway footer:

To give your content a more customized look, you can now remove the informational footer at the end of a Sway.

Office 365 customers can now create more powerful Sways and add password protection.

Easily track your trips and packages with Outlook

We are adding new experiences to Outlook that make it easy to stay on top of your travel and package deliveries. Outlook can already add events from your email to your calendar automatically; soon you will see summary cards for these events in both your inbox and your calendar, reminding you of the important details. You can quickly check in for flights, change hotel reservations and track packages. You will also get separate, actionable reminders with boarding information for your flights. These features are rolling out now in Outlook on the web and Outlook for Mac, and will come soon to Outlook for Windows, iOS and Android, and the Windows 10 Mail and Calendar apps.

*For more details, see: new travel and delivery experiences

 

Microsoft Planner is generally available

Microsoft Planner is rolling out worldwide to all eligible Office 365 commercial customers. Planner gives businesses, schools and organizations a new, improved way to organize teamwork and get more done. Teams can create new plans; organize, assign and collaborate on tasks; set due dates; update statuses and share files, while visual dashboards and email notifications help keep everyone on top of your projects.

*Learn more about Planner: Microsoft Planner ready for showtime

*Get going easily: Get started quickly with Microsoft Planner

GigJam preview now open to everyone

Earlier this month we announced that GigJam is broadly available in preview for users on Windows, Mac, iPhone and iPad. GigJam is designed for people with a collaborative mindset: it lets you spontaneously and momentarily involve others in your work. It gathers the live information you need from your applications and lets you circle what you want to share and cross out what you don't; it controls what people inside and outside your organization can see and even work on with you in real time. Get started by visiting aka.ms/gigjam and the App Store, and share your feedback on UserVoice. We will keep updating GigJam based on your feedback, and it will become generally available with Office 365 later this year.

New Deferred Channel build, plus Office 365 regions and security features for commercial customers

This month we made several updates for commercial subscribers, giving organizations more flexibility, manageability and control:

  • New Deferred Channel build:

The second Deferred Channel build of Office 365 is now available. This build effectively combines the Office 2016 feature release from February with the last four months of security updates. The Deferred Channel option reduces the frequency of feature changes in the Windows desktop applications and gives IT additional runway to validate add-ins, macros, custom line-of-business applications and more.

*Learn more about this month's updates: Office 365 Client Release Notes

  • Office 365 datacenter regions in Canada and South Korea:

Embrace the cloud-first world on your own terms! The new Office 365 datacenter region in Canada is now generally available, providing Canadian customers with in-country data residency, failover and disaster recovery for core customer data. We are also expanding the Microsoft Cloud to include a datacenter region in South Korea. These new datacenter regions join Japan, Australia, India, and the recently announced expansions in the United Kingdom and Germany.

  • Office 365 Advanced Security Management:

Earlier this month we introduced Advanced Security Management, a set of capabilities powered by Microsoft Cloud App Security that provides enhanced visibility and control over your Office 365 environment: threat detection that monitors security incidents and identifies high-risk and abnormal usage; granular controls and security policies to tailor Office 365; and an app discovery dashboard that gives visibility into the use of Office 365 and other productivity cloud services.

*Watch From Inside the Cloud to learn more:

*Learn more about this month's Office 365 updates:

Office 2016|Office for Mac|Office Mobile for Windows|Office for iPhone and iPad|Office on Android

*Office 365 Home and Personal customers: be sure to sign up for Office Insider to be among the first to learn about the latest updates!

*Commercial customers can find help through First Release.

Thank you for your continued feedback and support!

—Kirk Koenigsbauer (Corporate Vice President for the Office team)

Lab Materials and Virtual Machines from the IT Camps


Over roughly four years, through June 2016, we ran hands-on seminars called IT Camps covering a wide range of IT engineering technologies.

Previously, attendees of an IT Camp could request the VMs afterward and receive a separate download link. As of today, the complete set of lab materials (PDF) and the virtual machines are available as a bundle via FTP.

Please keep in mind that none of these materials are an official Microsoft distribution; they are materials I put together while preparing the camps. Commercial use is therefore not permitted. (I will not accept requests for the original, editable versions of the lab manuals.)

In addition, the FTP server is hosted in my demo infrastructure, so it has a concurrent connection limit (3 users), a capacity limit, and an idle-connection timeout. It is also configured as FTPS, so please connect with a client program such as FileZilla.

If the password changes, I will update this post. For most of the VMs, simply download them and use Hyper-V's import feature. They were built from evaluation media, so for an OS whose evaluation period has expired you can extend it with SLMGR /REARM; SQL and System Center will run once you enter a product key. For Azure, upload the VHD files in the folder to a storage account in Azure and use them from there.

I hope this is of some help, however small, in your technical studies.

Site collection provisioning and customisation scenarios for SharePoint online



I've worked with a few customers recently who are either starting a deployment project in SharePoint Online or some way into one, and one of the challenges they had to deal with is how best to provision customised site collections.

This article covers site collection provisioning and customisation for the following scenarios:

1/ Bulk site collection provisioning and customisation

2/ Ongoing/requested site collection provisioning and customisation.

Scenario 1 – Bulk provisioning of site collections.

Some customers must deploy large numbers of customised site collections, which means that provisioning site collections manually is not really an option.

To provide some real-world context, I worked with a customer a few months ago that was facing exactly this challenge. They had to provision ~250 site collections (one for each branch office); each site collection had several sub-sites and was relatively customised (custom master page, lists/libraries, content types etc). Manually, it was going to take about 3-4 hours per site collection to apply the customisations. For obvious reasons, this option could not seriously be considered, due to the resource requirements.

The solution that I worked with the customer to develop was to automate the site collection provisioning process using PowerShell. We agreed that a CSV file would act as a work list containing the information required to provision site collections. The process would take each entry in the CSV file, provision a site collection, and apply customisations to it depending on the type of site collection required. The flow/process looked something like this:

[Diagram: bulk site collection provisioning flow]

PowerShell…

PowerShell has been around for a long time now, and most customers that I work with are very experienced with it, but they often haven't had to use anything outside of the Cmdlets provided with the on-premise SharePoint snap-in.

A brief tangent from the story here: if you haven't had to write any PowerShell for SharePoint Online so far, then there are a few things that you should be aware of:

  1. SharePoint Online management shell (introduction & download) – is a really useful tool if you are a SharePoint Online administrator. It contains approx. 42 Cmdlets that enable you to perform basic administration tasks; however, you should understand a few things:
    1. The smallest entity that you can work with is a site collection; therefore, you will not be able to access a sub-site, list/library, etc.
    2. To use the SharePoint Online management shell, you must be a SharePoint Online global administrator.

  2. SharePoint Online objects – are different from their on-premise cousins. Try comparing the results from returning a SharePoint Online site collection (Get-SPOSite) and the on-premise equivalent (Get-SPSite). You'll notice, if you run Get-Member (e.g. Get-SPSite -Identity https://intranet.contoso.com | Get-Member) on both, that the SharePoint Online site collection object is actually a Microsoft.Online.SharePoint.PowerShell.SPOSite. It has no real methods of use, and some properties are read-only; it also has substantially fewer properties than the on-premise equivalent.

  3. CSOM – If you have been scripting with PowerShell for SharePoint (on-premise) for a few years, then there is every chance that you haven't had to deal with the CSOM. The Client Side Object Model offers an extensive number of SharePoint objects that you can work with from your client PC/any server that has CSOM available. It does, however, require you to really think about how you are going to approach your script, as you should understand 'request batching' – see the sketch below.
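As a small illustration of that batching model, here is a sketch of CSOM from PowerShell; the DLL paths assume the client components are installed in their usual location, and the tenant URL and account are placeholders. Nothing is sent to the server until ExecuteQuery() runs:

# Sketch: CSOM from PowerShell. Load() calls are only queued locally;
# the whole batch goes to SharePoint Online in one round trip when
# ExecuteQuery() is called.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://<tenant>.sharepoint.com/sites/demo")
$securePwd = Read-Host "Password" -AsSecureString
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("admin@<tenant>.onmicrosoft.com", $securePwd)
$web = $ctx.Web
$ctx.Load($web)        # queued locally, nothing sent yet
$ctx.ExecuteQuery()    # one round trip executes the queued batch
Write-Host "Web title: $($web.Title)"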

Back to the scenario!

PowerShell was the preferred choice for this customer scenario, because using a CSV input file meant that the customer stayed in control of the provisioning process, and PowerShell is (debatably) easier to read. There is no reason you couldn't take another approach.

However, as I will reiterate several times, there are a few differences in the approach you must take when developing a PowerShell script for SPO as opposed to on-premise installations of SharePoint.

Connections

Whether you are going to run Cmdlets from the SharePoint Online management shell, use the CSOM in your PowerShell script, or use the PnP PowerShell module, you are not running your code directly on the SharePoint server. This means that you must establish a connection with your SPO tenant from your local system (or the system running the code) before you attempt to do anything else.

This is achieved by running the Connect-SPOService Cmdlet, providing the Url of your admin tenant, and the username of your SharePoint Online global administrator account:

$adminCenterUrl = "https://<tenant>-admin.sharepoint.com"
$adminAcct = "admin@<tenant>.onmicrosoft.com"
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
Connect-SPOService -Url $adminCenterUrl -Credential $adminAcct

 

Also, please note that in the code above, I have imported the SharePoint Online management shell module prior to running the Connect-SPOService Cmdlet.

A few things to note here… The script is going to prompt you for the password of the account (credential) that you use.  If you want to run the script unattended (i.e. you won’t be around to type the password in), then I invite you to look into using Windows Credential Manager:

Store credentials using Windows Credential Manager:

  1. Open the control panel (from run: control panel).
  2. Click on User Accounts.
  3. Click on Credential Manager.
  4. Click on Windows Credentials.
  5. Click Add a generic credential.
  6. Enter the following information:
     • Internet or network address – think of this as a reference or alias
     • User name
     • Password

You should end up with something like this:

[Screenshot: the stored generic credential in Credential Manager]

You can now make an adjustment to your script:

Connect-SPOService -Url $adminCenterUrl -Credential SPOAdmin

The script will now refer to the credential that you just created, meaning you can run the script with a scheduled task if needed (please ensure that you have logged on with the account that you intend to use, so that a local profile exists).

Input file…

The script that we developed took a CSV file (see below) as its input, with the intention of iterating through each line and provisioning a site collection using the OOTB team site template. We achieved this using the Import-Csv Cmdlet:

$buildList = Import-Csv -Path "C:\temp\SiteCollectionBuild.csv"

Templating…

Back in the day, customers would often create a site, configure it for a purpose through the user interface, save it as a template, and then make it available to end users. This is fine; however, it does present a few administration challenges. Nowadays I always recommend that customers first define what a site should look like (e.g. document any sub-sites, lists, libraries, content types, web parts etc.) and provision them with code, either CSOM or PowerShell. This does mean that you must have the skillset to write the code/script, but in my opinion it provides a more controlled method for deploying site collections. We'll be talking a bit more about this in the next scenario; however, if this approach to site collection 'templating' makes you feel nervous, then perhaps watch this Channel 9 session where Vesa Juvonen explains it in far more detail than I am going into here.

How we managed the type of site in this customer scenario was with the addition of a 'Site template' column in the CSV file:

[Screenshot: the CSV input file with the Site template column]

Now, in the provisioning code, we can detect the type of site collection that should be provisioned (if $siteCollection.SiteType -eq "ProjectSite"):

foreach ($siteCollection in $buildList)
{
    $site = $siteCollection.Url
    Write-Host "Provisioning $site" -ForegroundColor Yellow
    New-SPOSite -Url $siteCollection.Url -Owner $siteCollection.Owner -Title $siteCollection.Title -StorageQuota 1000 -Template STS#0 -LocaleId 1033
    if ($siteCollection.SiteType -eq "ProjectSite")
    {
        New-SPOList -Title "Project Risks" -Template GenericList -Url Lists/ProjectRisks -OnQuickLaunch
        New-SPOList -Title "Project Issues" -Template GenericList -Url Lists/ProjectIssues -OnQuickLaunch
        New-SPOList -Title "Project Documents" -Template DocumentLibrary -Url Lists/ProjectDocuments -OnQuickLaunch
        New-SPOList -Title "Project Design" -Template DocumentLibrary -Url Lists/ProjectDesign -OnQuickLaunch
    }
    if ($siteCollection.SiteType -eq "TeamSite")
    {
        New-SPOList -Title "Team Social" -Template GenericList -Url Lists/TeamSocial -OnQuickLaunch
        Add-SPOField -List "Team Social" -DisplayName "Event Name" -InternalName "EventName" -Type Text -AddToDefaultView
    }
}

 

A quick walkthrough of the code above.

We've imported the contents of the CSV file into a variable called $buildList. We then iterate through each line of $buildList and provision a new site collection using the OOTB team site template (STS#0).

At this point, we check to see what type of customisations we must apply to the site collection. This is managed by the conditional ‘if’ statement:

if ($siteCollection.SiteType -eq "ProjectSite")
{
    <Apply customisations>
}

The code between the curly braces is where we start to provision the relevant artefacts that make up the type of site; e.g. ‘ProjectSite’. In the example that I am using here, you can see that I am provisioning new custom lists and fields. In your scenario you might also be deploying branding, content types, etc.  The SharePoint Online management shell doesn’t have any Cmdlets that provide this functionality, so how is this working?

Office Dev Patterns and Practices…

The script that the customer and I wrote heavily used an amazing community collaboration program called Office Dev Patterns and Practices. This is a collection of source-code projects that address common challenges/requirements and, in addition, an extremely comprehensive collection of PowerShell Cmdlets designed to help customers be incredibly efficient.

This PowerShell module is how we apply the customisations to our site collection.

The PowerShell Cmdlets are available on GitHub. Here is how to install and start using the module (download it before proceeding):

Install PnP-PowerShell-master binaries

  1. Download and extract PnP-PowerShell-master from the GitHub repository.
  2. Navigate to the Binaries folder.
  3. Run SharePointPnPPowerShellOnline.msi
  4. Choose (or not) to accept the terms in the license agreement.
  5. Click Install.

Alternatively you could use the Install-Module Cmdlet:

Install-Module SharePointPnPPowerShellOnline

Now you are good to start using the Cmdlets. If you head over to the GitHub repository you'll be able to review the Cmdlet documentation, which is very thorough, and get an idea of what you can now do with your SPO tenant!

The complete code now looks like this:

$adminUrl = "https://<tenant>-admin.sharepoint.com"
$spoAdmin = "admin@<tenant>.onmicrosoft.com"
$buildList = Import-Csv -Path "C:\temp\SPOSiteCollectionBuildList.csv"
Connect-SPOnline -Url $adminUrl -Credentials $spoAdmin

foreach ($site in $buildList)
{
    New-SPOSite -Url $site.Url -Owner $site.Owner -Title $site.Title -Template STS#0 -StorageQuota 2000 -LocaleId 1033
    if ($site.SiteType -eq 'ProjectSite')
    {
        <# Apply customisations, e.g. branding, content types, lists & libraries, etc. #>
    }
}

That wraps up the first scenario; I hope that you find it useful.

Scenario two – ongoing provisioning and customisation of site collections

 

The second scenario that I want to talk about is managing the ongoing requests for site collections.

In the first scenario we discussed a fantastic community program (Office Dev Patterns and Practices). We're going right back to this now, as there is a fantastic code sample that provides us with a solution for provisioning site collections. Please visit this URL.

This is a great example of asynchronous site collection provisioning using a provider-hosted SharePoint add-in. Vesa Juvonen has a very detailed walk-through of the code and flow in a Channel 9 video.

The flow is remarkably like the one we covered in the bulk site collection provisioning scenario:

[Diagram: asynchronous site collection provisioning flow]

The process starts off with the user completing a custom site collection request form, like this example (note: the screenshot was taken from a high-trust add-in implementation, so ignore the managed path selection box):

[Screenshot: the site collection request form]

Once the user has submitted the request, the user receives confirmation:

[Screenshot: the request confirmation message]

And at this point a list item is created in a site provisioning request list.

That completes the first phase of the provisioning process. The second phase is to provision the site collection.

Much like the bulk site collection creation scenario that we covered earlier, we have a couple of options available to us:

  1. A console application
  2. A PowerShell script

Either of the above would be able to build a site collection and apply the customisations depending on the type of site required; see the sketch below.
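As a sketch of what the PowerShell option could look like with the PnP module (cmdlet names here are the current PnP ones; the 2016-era equivalents carried the SPO prefix, and the request list name and columns are hypothetical):

# Sketch: process pending items from a hypothetical "Site Requests" list and
# provision a site collection for each. List, column and URL names are all
# placeholders; adapt them to your own request list schema.
$reqConn   = Connect-PnPOnline -Url "https://<tenant>.sharepoint.com/sites/provisioning" -Credentials SPOAdmin -ReturnConnection
$adminConn = Connect-PnPOnline -Url "https://<tenant>-admin.sharepoint.com" -Credentials SPOAdmin -ReturnConnection
$pending = Get-PnPListItem -List "Site Requests" -Connection $reqConn |
    Where-Object { $_["Status"] -eq "Pending" }
foreach ($request in $pending) {
    New-PnPTenantSite -Title $request["Title"] -Url $request["SiteUrl"] -Owner $request["RequestedBy"] -TimeZone 2 -Template "STS#0" -Wait -Connection $adminConn
    Set-PnPListItem -List "Site Requests" -Identity $request.Id -Values @{ "Status" = "Provisioned" } -Connection $reqConn
}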

If you are new to provider-hosted add-ins, then here are some useful resources:

Introduction to SharePoint add-ins: https://msdn.microsoft.com/en-us/library/office/fp179930.aspx

Hands on lab: getting started with SharePoint Add-ins – https://dev.office.com/hands-on-labs/1518

 

 

 

 

 

 


Real time protection status issue in OMS Security and Audit Solution


Summary: Troubleshoot servers with real time protection enabled that are not being correctly reported.

 

Good morning everyone, Mark Waitser and Yuri Diogenes here. Today we want to talk about an issue where certain servers are not correctly reported with respect to their real time protection status.

Problem

A Windows Server 2012 computer with Microsoft System Center Endpoint Protection installed and Real Time Protection enabled is reported in the OMS console as though real time protection were not enabled (ProtectionStatus: No real time protection).

Note: this may also occur in Windows Server 2008 or Windows 7 SP1.

Cause

Microsoft System Center Endpoint Protection is detected, but ProtectionStatusRank is equal to 270 (No Real Time Protection), as shown below:

[Screenshot: search result showing ProtectionStatusRank 270]

Troubleshooting steps

  • Verify that all monitors are enabled; see the example below:

[Screenshot: SCEP monitor settings]

  • Notice that "Behavior Monitor" is disabled; this is the reason for the 270 rank.

Solution

Enable all Monitors via SCEP management console as shown below:

[Screenshot: enabling all monitors in the SCEP management console]

Authors

Mark Waitser, Senior Software Engineer (OMS Security Team)

Yuri Diogenes

 

If you use Facebook, you may want to join the Microsoft OMS Facebook site. If you want to learn more about Windows PowerShell, visit the Hey, Scripting Guy Blog.

If you would like to get a free Microsoft Operations Management Suite (#MSOMS) subscription so that you can test it out, you can do so from here. You can also get a free subscription for Microsoft Azure as well by selecting this link.

Mission Marketing 2016


In our conversations with partners, we repeatedly find how much marketing requirements have changed in recent years, not least because of the cloud. On the one hand, the meaning of the famous 4 Ps (Product, Price, Place, Promotion) is shifting; on the other hand, cost pressure and the call for workable marketing performance measurement keep growing.

With the 4 Ps, the perspective changes. Our "product" is a service and thus no longer tangible. Our "place" is no longer the location where the product is found and bought, but rather the places where the service can be used (in the office or on the go, on a PC or on a smartphone). The "price" gives way to a monthly usage fee that also varies with usage. Other sources even argue that the marketing mix for services must be extended to 7 Ps (Processes, People and Physical Evidence) – see here or here.

The rising cost pressure in marketing (and in sales) results from the shift away from multi-year contracts with large order entry toward (cloud-typical) monthly, usage-based billing with more flexible options to downscale or cancel. Acquisition costs in sales and marketing therefore only pay off over a longer period.

Against this backdrop, it becomes clear why digital marketing in particular has gained importance in recent years. No question: digital marketing can also be very expensive. But it doesn't have to be, if you use the right channels skillfully and minimize wasted reach.

To give partners the best possible support in implementing their marketing strategy with digital marketing building blocks, the Partner Marketing team has selected, vetted and centrally compiled numerous offerings for partners, in the new Marketing Service Portal for partners.

 

[Screenshot: the Marketing Service Portal for partners]

Whatever your goals – raising awareness, sharpening the perception of your brand, retaining existing customers or winning new ones – you will find the right building blocks in the Marketing Service Portal for partners.

Using filters or free-text search, you can quickly find the right offerings: some as free best-practice guides with advice from our experts, some as packaged offerings from one of our agencies at a preferential price.

 

 

 

 

[Image: Mission Marketing 2016]

With Mission Marketing 2016, finally, we bring it all together and take you on a journey of discovery: "The sure path to galactic revenues". Every two weeks you will receive a "mission" by mail and engage with a selected marketing topic.

Take advantage of the many offerings in the Marketing Service Portal for partners and register for Mission Marketing 2016 today.

Dramaturgy in the Age of Digitalization


As part of the intercultural theater project PHONE HOME, guest author Michael Sommer explores digitalization, networked ways of working, and communication technologies in the context of theater on the Microsoft press blog. As Artistic Director, he leads the German part of PHONE HOME for Pathos München. The international project takes place in cooperation with theaters in London and Athens – connected by Microsoft, with Skype linking all stages and merging the performances in the different cities into one.

Phone_Home

 

One thing up front: e-castings do not yet exist in German municipal and state theaters – at least not that I know of. In the film industry they are now very common: instead of inviting actors to castings and screen tests, actors send in a short video, or a Skype appointment is arranged right away. It spares the production company's budget, costs the actor less time, and in any case gives the casting director an impression of how an actor comes across on screen. Theater, of course, has different needs. Where are networked ways of working helpful and sensible? A status report from a dramaturg's perspective.

What is dramaturgy, anyway?

Most German municipal and state theaters are organizations with a permanent ensemble and a production-oriented way of working. The dramaturg's role is threefold:

  • First, together with the (often external) director they develop a version of the play to be staged and accompany the production as the "first audience member" by criticizing, questioning, and affirming.
  • Second, as the "brain of the theater" they shape the season program together with the artistic director by reading many plays, selecting a few of them, and then putting casts together.
  • Third, as the "voice of the art" they are responsible for all forms of theater outreach, from the program booklet to audience talks to video clips – for everything that gets the art talked about.

The core is production dramaturgy, and when we talk about networked ways of working, it plays the biggest role in preparing a production. No play today is brought to a German stage exactly as its author wrote it. The team of director and dramaturg sets its own emphases, makes cuts for reasons of content and practicality, and develops its own version.

 

Yellowed art? Behind the scenes of the theater, thoroughly modern technologies are at work (photo: Michael Sommer)

 

How dramaturgical work is changing

Until well into the nineties, joint production preparation by dramaturgs and directors was possible almost only in face-to-face conversation. Regular working meetings before rehearsals begin are not always easy to organize, because theater directors are typically working nomads who stay at a theater for only a few weeks and then move on.

Of course you can phone about individual questions. But when you are looking at a text full of changes and rewordings, when you have to weigh what a phrase means, and when you are discussing whether the associations it triggers point in the right direction – then even experienced communicators reach their limits. It is no secret that the greater part of communication does not happen through the verbal channel but through gesture, facial expression, posture, and all those actions we are barely aware of in everyday life.

The great task of a collaborative artistic production – finding a common language – is a process that cannot take place in language alone. The first real alternative for collaboration between director and dramaturg, and similarly between director and set or costume designer, is the video conference. Of course it takes some practice to work together over Skype, and a poor internet connection at one end can be frustrating. But when the technical conditions are right, virtual collaboration is a real alternative to in-person working meetings, which can be arranged less and less often in times of scarce resources.

Skype remains the communication tool of choice

Like many other institutions, public theaters sometimes find it hard to keep pace with current social developments. Still, the widespread prejudice that theaters are "dusty" and "yesterday's news" is just as wrong as the prejudice that computer games "glorify violence" and "make you stupid". No – German municipal theaters, too, have arrived in the present and are picking up many trends of the media revolution. Theater has always been an extremely mobile art form; artists have always been in one place only temporarily, and not just since the trend toward shrinking or even abolishing ensembles in favor of freelance actors. For freelance artists especially, but increasingly also at municipal and state theaters, ways of working that use new communication technologies to be location-independent are becoming the norm.

Of course, emails are written and text messages and social media are used. But Skype remains the communication tool of choice. Thanks to video telephony, the application enables a more holistic form of understanding, which is indispensable in an artistic, people-focused production process.

In my time as a dramaturg at a municipal theater, I have handled play developments, stage adaptations, translations, artistic planning meetings, and many other projects largely via video conference – and the trend is rising.

And yes, now and then rehearsals already happen via Skype – but that has to stay between us, otherwise funding will get cut again.

 

A contribution by Michael Sommer, artistic director of the networked theater project PHONE HOME.

Portrait-MS-062015-750x894

[Script Of Jul. 7] How to find out how many users are running what version of the client

Free upgrade to the most secure Windows: full-screen notification informs users that the upgrade offer is ending


The free upgrade offer to Windows 10 for eligible Windows 7 and Windows 8.1 PCs runs until July 29, 2016. That leaves Windows customers less than a month to decide on the free move to the latest Windows generation. Microsoft recommends the free upgrade to all consumer and business customers for the most secure user experience, and is therefore continuously informing customers about the offer and making access to the upgrade as easy as possible.

Just under a month before the free upgrade offer expires, Microsoft is now extending this service with a full-screen notification that can appear on PCs running Windows 7 (Service Pack 1) and Windows 8.1.

Upgrade notification

The blue-background message informs users that the free offer ends on July 29, 2016, and recommends taking advantage of it before it expires in order to benefit from the most secure Windows version to date. On the screen, users choose whether they want to run the upgrade right away, be reminded later, or receive no further notifications. The reminder also points out that returning to the original operating system within 31 days of the upgrade is easily possible.

Only some of our customers will receive this notification
The notification is aimed primarily at users who have not yet been informed about the free upgrade. Users with an up-to-date "Get Windows 10" app (installed or hidden) will not receive the message, nor will incompatible PCs or machines on which Windows 10 was uninstalled after an upgrade or on which the upgrade installation failed. The reminder also does not appear if the Windows 10 upgrade or the notification has been disabled (a scripted way to do this via policy is sketched below).
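
For administrators who need to suppress the notification and the upgrade offer centrally, Microsoft documents two Group Policy registry values in KB 3080351 (DisableGwx and DisableOSUpgrade). The following PowerShell sketch applies them; the key paths and value names come from that KB, while the script itself is only an illustration and should be tested before rollout:

# Suppress the "Get Windows 10" notification and the OS upgrade offer.
# Registry values documented in Microsoft KB 3080351; run elevated.
$gwx = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\Gwx'
$wu  = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate'

# Create the policy keys if they do not exist yet
foreach ($key in @($gwx, $wu)) {
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
}

# DisableGwx = 1 hides the "Get Windows 10" app and its notifications
New-ItemProperty -Path $gwx -Name 'DisableGwx' -Value 1 -PropertyType DWord -Force | Out-Null

# DisableOSUpgrade = 1 blocks the upgrade offer via Windows Update
New-ItemProperty -Path $wu -Name 'DisableOSUpgrade' -Value 1 -PropertyType DWord -Force | Out-Null

Once the values are set, the notification should no longer appear; removing them restores the default behavior.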

More information about the full-screen notification is available here. More details about the free upgrade offer can be found at windows.com/windows10upgrade or in our Windows 10 press kit.

_ _ _ _

About the author:

Markus Nitschke, Microsoft Deutschland GmbH

Markus Nitschke (44) has headed the Windows Consumer business unit at Microsoft Deutschland since October 1, 2014. Nitschke has been with Microsoft for a total of nine years. After positions in Redmond and in the Central and Eastern Europe (CEE) division, he moved to Microsoft Deutschland GmbH. In his role he is responsible for the Windows consumer segment and reports directly to Oliver Gürtler, head of the Windows business unit.
