Channel: TechNet Blogs

The Game of Phishing – How to beat your Opponent


With more than 1.4 billion clear-text user credentials accumulated and up for grabs on the dark web, it is clear that harvesting credentials from genuine users and organizations is one of the most important phases of the cyber kill chain.

The Verizon Data Breach Investigations Report 2017 says that 81% of breaches involved compromised credentials, and in 75% of them the perpetrators were outsiders.

Phishing emails are proving to be the most effective way to grab users' credentials, which are then used to carry out attacks on those users and their organizations.

If you are new to this term, this should help.


The Mind Game

Over the years, the art of phishing has evolved, and adversaries now use ever more sophisticated ways to trick the human mind into making a wrong judgement.

If, like me, you are a fan of National Geographic's popular show "Brain Games", you've seen the amazing ways the human brain functions: one part of the brain questions everything and views it with suspicion before making any decision, while another part simply accepts what it sees as true and acts on it.

For example, when you are about to cross the road, a part of your brain looks at the scenario with suspicion and caution. It decides to cross only after determining that there is no threat to life from incoming vehicles. Let's call this part of the brain Part 1.

In a different scenario, when you pick up a TV remote and are about to press the power button, does your brain view this act with the same level of caution and suspicion? No, right? This time the other part of the brain makes the decision, assuming that when you press the button the TV will turn on and nothing bad will happen. Let's call this part of the brain Part 2.

Adversaries now use various psychological tactics to make Part 2 of your brain act, pushing you to respond to an email swiftly while suppressing the other side of your brain that would question it.

Let's play a game.

Can you tell which of the following screenshots of the Office 365 logon page is a phishing page and which one is the authentic webpage?

Fig. 1

Fig. 2

If you happen to land on one of these webpages and provide your credentials, you have just made an attacker's life easy by handing over one of your organization's critical assets: your username and password.

So what could really happen when someone else has your credentials?

1. They can log on to your mailbox and use your email account to send emails with malicious attachments to all your colleagues. Since your colleagues trust you, they will not use Part 1 of their brains to think twice before opening those attachments.

2. They can use your mailbox to attack your friends and family in a similar way.

3. They can use your mailbox to spear-phish senior leaders of your organization, grab their credentials, and elevate privileges.

4. They can connect to your corporate network over VPN as you and initiate exploration and then exploitation activities, or simply spread worm-based ransomware across the network.

5. And much more.

How to beat your opponent?

Detecting Signs of a Phishing Email

1. Sense of urgency - Look for a sense of urgency in the email. If it asks you to act in a hurry, with words like "immediately" or "urgently", be cautious.

2. Grammatical errors or spelling mistakes - More often than not, attackers from non-native English-speaking regions make spelling or grammatical mistakes in their emails or on the phishing site.

3. Spoofed email sender - Do not trust the sender name in your email header. While the mail might look like it is coming from a known sender, when you expand the sender's name you may see a different (spoofed) email address.

4. Obfuscated URLs - What you see may not be what you get. Hover your mouse over the links to see the actual URL they take you to. If you see Base64 in the URL, move away.

5. Detailed email header - If you are ready to dive deep, look at the detailed email header to review the complete mail flow and sender information, including the IP address and sender domain.

6. Email formatting - Emails from reputable organizations go through multiple formatting reviews. If an email looks weirdly formatted, it may be anomalous.
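On the Base64 point in tip 4 above: attackers sometimes embed the victim's email address, Base64-encoded, in the URL so the phishing page can pre-fill it. A quick PowerShell sketch of how such a value can be decoded for inspection (the URL and encoded string here are made up for illustration):

```powershell
# Illustrative only: the URL and encoded value below are made up.
$suspectUrl = 'https://example.com/login#dXNlckBjb250b3NvLmNvbQ=='
$encoded = $suspectUrl.Split('#')[-1]
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encoded))
# Decodes to user@contoso.com - the targeted address, ready to be pre-filled on the fake page
```

Seeing a Base64 blob like this in a link you did not expect is a strong hint to move away.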

 

Detecting Signs of a Phishing Website

If you do end up on a site by clicking one of those URLs in your email, the same kind of scrutiny can help you detect a phishing site.

Yes, it would be really taxing to check every email with suspicion, open every link in the email, and verify the webpage for signs of phishing.

If you are an Office 365 customer, Office 365 Advanced Threat Protection protects against phishing attacks by analyzing the URLs in emails and blocking access to malicious websites at the time of click.

If you'd like to see a short demo of how Office 365 ATP Safe Links protects against phishing attacks, check this video.

To learn more about the Safe Links capabilities of Office 365 ATP, check here.

PS: If you have not figured it out yet, Figure 2 is the phishing page 🙂

Cheers

Iftekhar


Azure AD B2B…how to work with partners and subsidiaries


 

Azure AD Business-to-Business, or Azure AD B2B, is a topic of interest for nearly every organization I speak with. Today many organizations have third-party IdPs (identity providers) or AD FS deployed and federate with their business partners. Federation establishes a trust that provides two-way or one-way access to company resources and applications.

However, with the abundance of SaaS applications that now drive many business functions and processes, companies also need a way to give business partners access to those applications.

For example, business partner visibility into inventory management systems, CRM, marketing, O365, and even HR applications is necessary to ease the flow of business partnerships and allow for collaboration on projects.

Fortunately, Azure Active Directory offers Azure AD B2B, whereby users from an external organization may be invited to access the applications of the company that invited them. Another use for Azure AD B2B is working with subsidiaries or mergers/acquisitions to provide company-wide access to resources.

 

Inviting external users to Azure AD

There are a few options that may be utilized when inviting a user to your Azure AD tenant:

  1. Azure AD admin portal
  2. PowerShell – single invite or bulk upload
  3. Microsoft Graph

 

Azure AD admin portal invitation process

  1. Navigate to portal.azure.com as an admin of the tenant you'd like to invite the external user to.
  2. Locate Azure Active Directory and select Users and groups.
  3. Select All users.
  4. Select New guest user as shown below.
  5. Fill in the email address of the user you're inviting, add a personalized message if desired, and select Invite. The invited user will then receive an email that I show later in this post.


 

PowerShell single user invitation process

Using PowerShell to invite an external user is straightforward: open PowerShell, sign in to Azure AD as an administrator (Connect-AzureAD), adjust the script below as needed, and execute it. Note that $messageInfo must be created first; a minimal example is included below.

$messageInfo = New-Object Microsoft.Open.MSGraph.Model.InvitedUserMessageInfo
$messageInfo.CustomizedMessageBody = "Welcome aboard - please use the link below to get started."
New-AzureADMSInvitation -InvitedUserEmailAddress "scranz@berntoso.com" -InviteRedirectUrl https://myapps.microsoft.com -InvitedUserDisplayName 'Sara Cranz' -InvitedUserMessageInfo $messageInfo -InvitedUserType member -SendInvitationMessage $true

 


 

PowerShell invitation process - Bulk upload

If a bulk user invitation process is desired, names and email addresses may be pasted into a .csv file and used for upload.

Example .csv file


More details about bulk upload here: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-code-samples

Note: make sure the .csv is in proper format or the script will fail.
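The Microsoft code samples linked above cover this in detail. As a rough sketch, and assuming a .csv with Name and InvitedUserEmailAddress columns like the example shown (column names and the file path are illustrative), the bulk process boils down to looping over Import-Csv:

```powershell
# Sketch only: assumes an existing Azure AD session (Connect-AzureAD) and a CSV
# with Name and InvitedUserEmailAddress columns; names/paths are examples.
$invitations = Import-Csv -Path .\invitations.csv
foreach ($row in $invitations) {
    New-AzureADMSInvitation -InvitedUserEmailAddress $row.InvitedUserEmailAddress `
        -InvitedUserDisplayName $row.Name `
        -InviteRedirectUrl "https://myapps.microsoft.com" `
        -SendInvitationMessage $true
}
```

Each row produces one invitation email, exactly as in the single-user example earlier.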


 

Guest user invitation redemption process

Once the user(s) receive an invitation mail, they’ll see something similar to the image below. The user will then select “Get Started” from the email to begin the invitation redemption process.


 

Dynamic Groups

Once external users exist in a tenant, dynamic group membership may be used to assign users to groups automatically; for example, any user with an @contoso.com address may be dynamically assigned to Group A. Group A can also be assigned to SaaS applications or to SharePoint Online/OneDrive sites, so as soon as a user lands in the group they have immediate access to the app(s) assigned to it.

Dynamic group membership eases the management process of adding and removing users from applications. Simply assign a group to the application permission and use dynamic group rules to automatically assign and remove users. You can even use attributes such as employeeId, mail, or companyName as attributes to match on; there are many more attributes to choose from, and depending on where the users originate, you may want to get creative. Finally, for applications that support provisioning, guest users may be automatically provisioned to SaaS applications, which provides full user lifecycle management.
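As a concrete illustration (the domain is an example, not from this post), a dynamic membership rule that sweeps every invited guest from one partner domain into a group can be written in Azure AD's dynamic-membership rule syntax like this:

```
(user.userType -eq "Guest") -and (user.mail -contains "@berntoso.com")
```

Any user matching the rule is added to the group automatically, and removed again if their attributes stop matching.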

For more details about Azure AD Dynamic Groups please visit: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-groups-dynamic-membership-azure-portal

 

Azure AD B2B Licensing

Certain Azure AD B2B scenarios have licensing implications and the following site addresses licensing scenario best: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-licensing

For example, even though users may have the role of guest or member, depending on where they originated, additional licensing may be needed. If an invited user is from a subsidiary, Azure AD Premium would be required for that user; however, if the user was invited from an external partner, they may be covered under Azure AD B2B licensing.

Refer to the link above to make sure you have a clear understanding of which licenses are required and when.

 

Pay close attention to the following please:

If invited users already have an existing O365 or Azure AD tenant of their own, the first user added will need to go through the invite redemption process. After that, the external user who belongs to the partner tenant can add users from their tenant, and those users will not need to go through the invite redemption process. This is only true when only Azure AD tenants are in play.

However, if users are external to Azure AD (e.g. outlook.com, gmail.com, on-premises AD, or another email address), the first user added to the tenant can invite other users; however, those users still need to go through the redemption process, since no accounts exist for them in an Azure AD tenant of their own.

In summary, any user who doesn't already have an Azure AD tenant for their org and is invited to Azure AD still needs to go through the invitation redemption process.

 


 

Guest user invitation redemption process – continued

Because I invited a user who is not part of an existing Azure AD tenant, they'll need to run through the account setup process as shown below. If the user already resided in an Azure AD tenant of their own, they would simply sign in and access the applications assigned to them.


 

Here the user associates a password with their account (under the covers a new Azure AD tenant is created for berntoso.com and users are added to it).


 

The user will receive a verification code in their inbox that they'll need to use to finish.


 

Once the verification code is confirmed, the account is created in the Azure AD tenant:


 

Once the redemption process is complete, they're taken to the URL that was provided during the invitation process the administrator performed (e.g. myapps.microsoft.com), and the user now has access to SaaS applications in your tenant as shown below. Send users links to SharePoint Online, Dynamics, Power BI, OneDrive, etc. for direct access to those applications.


 

 

User details 

By navigating to Azure AD and locating the user we see the following:

Does the user have to be a member? No, they can be a guest as well and still be able to invite users from their tenant.


 

 

Application/Group Assignments

Dynamic groups may also be utilized to automatically add users to one or more groups, and a group may be assigned to integrated SaaS applications to provide SSO or even provisioning. For example, I have a dynamic group assigned to OneDrive that is configured to automatically add any user with an @berntoso.com mail address:


 

Any user that has the @berntoso.com email address is automatically assigned to the “Berntoso User Group” which is already associated with a SaaS application.


 

To modify the permissions guest users have within your Azure AD tenant, navigate to Azure Active Directory and select "User settings". There we see the external user settings as shown in the image below.

For more details, please visit: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-delegate-invitations


 

Converting the user type, i.e. Guest/Member

Although this is optional, users may be converted from guest to member and vice versa using the "UserType" attribute (the two are one and the same and are typically changed only for identification purposes, e.g. external user vs. subsidiary user). If guest permissions are limited (see the image above), you may want to designate certain users as members so they can add and/or invite users.

Add the user as a member type instead of a guest. If the user is already a guest, you can promote them to member using the following PowerShell command:

Set-MsolUser -UserPrincipalName user_contoso.com#EXT#@contoso.com -UserType Member

For additional details about user properties and UserType scenarios please visit: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-user-properties

 

Azure AD B2B custom admin and self-service portal

By utilizing Microsoft Graph, organizations may create their own Azure AD B2B admin and self-service portals. For example, the invitation redemption process may be customized to have the user fill out additional fields such as company name, city, state, and phone; those fields will then be populated in the corresponding fields in Azure AD for the user. In addition, users must agree to your terms of service before requesting access. The options for a self-service portal are endless, and it may be developed to meet your organization's requirements.

 

The sample admin portal provides the ability to add different partner domains as shown below:  


 

Users who request access via the self-service portal can either be auto-approved or added to a queue for approval as shown below. Admins will then approve or deny the user access.


 

For each of the partner domains, additional settings may be added as shown below. In my example, users who sign in with @berntoso.com are auto-approved, added as members (not guests), and automatically assigned to groups (dynamic groups may be used instead of group assignments). The admin experience is customizable by you and a developer.


 

For more details on creating and customizing an Azure B2B admin and self-service portals, including downloading a sample portal please visit: https://github.com/Azure/active-directory-dotnet-graphapi-b2bportal-web

 

Additional developer resources

Create Invitation: https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/invitation_post
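The Create Invitation endpoint above accepts a small JSON body. A minimal hypothetical sketch from PowerShell ($token is assumed to already hold a Microsoft Graph access token with permission to create invitations):

```powershell
# Hypothetical sketch: $token must be a valid Microsoft Graph access token.
$body = @{
    invitedUserEmailAddress = "scranz@berntoso.com"
    inviteRedirectUrl       = "https://myapps.microsoft.com"
    sendInvitationMessage   = $true
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/invitations" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" -Body $body
```

This is the same call a custom self-service portal would make on a user's behalf after approval.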

Invitation Manager: https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/resources/invitation

Azure Active Directory B2B collaboration API and customization: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-api

Delegate invitations for Azure Active Directory B2B collaboration: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-b2b-delegate-invitations

 

Conclusion

We covered a lot of ground on Azure AD B2B. Whether you're looking to invite business partners or employees from subsidiaries, Azure AD has your collaboration scenarios covered.

SfBMac: Cannot connect to EWS after enabling EWS Access Policy


In a recent support case the Skype for Business Mac client wasn't connecting to Exchange Web Services (EWS) after the EWS Access Policy was configured with the following cmdlets:

Set-CASMailbox -Identity brick@borderlands.lab -EwsApplicationAccessPolicy EnforceAllowList -EwsAllowOutlook $true -EwsAllowMacOutlook $true
Set-CASMailbox -Identity brick@borderlands.lab -EwsAllowList @{add='UCWA/*', 'OC/*', 'OWA/*'}
https://technet.microsoft.com/library/bb125264(v=exchg.160).aspx

Get-CASMailbox -Identity brick@borderlands.lab | fl Name,EwsApplicationAccessPolicy,EwsAllowOutlook,EwsAllowMacOutlook,EwsAllowList

EWS was working everywhere except in Skype for Business Mac. After reviewing the logs, the issue turned out to be that the SfB Mac user agent is SfBForMac.
To fix this, we simply add SfBForMac to the EwsAllowList:

Set-CASMailbox -Identity brick@borderlands.lab -EwsAllowList @{add='SfBForMac/*'}

Please note that the previous example was only for a test user; we can also configure it at the organization level:

Set-OrganizationConfig -EwsApplicationAccessPolicy EnforceAllowList -EwsAllowOutlook $true -EwsAllowMacOutlook $true -EwsAllowList @{add='SfBForMac/*','UCWA/*', 'OC/*', 'OWA/*'}
https://technet.microsoft.com/library/aa997443(v=exchg.160).aspx

Get-OrganizationConfig |fl Name,EwsApplicationAccessPolicy,EwsAllowOutlook,EwsAllowMacOutlook,EwsAllowList

What’s new for US partners the week of December 18, 2017


Find resources that help you build and sustain a profitable cloud business, connect with customers and prospects, and differentiate your business. Read previous issues of the newsletter and get real-time updates about partner-related news and information on our US Partner Community Twitter channel.

Subscribe to receive posts from this blog in your email inbox or as an RSS feed.

Looking for partner training courses, community calls, and information about technical certifications? Read our MPN 101 blog post that details your resources, and refer to the Hot Sheet training schedule for a six-week outlook that’s updated regularly as we learn about new offerings. To stay in touch with us and connect with other partners and Microsoft sales, marketing, and product experts, join our US Partner Community on Yammer.

Top stories

Learning news

MPN news

Stay informed and engaged with US Partner Community & Microsoft

New Solution Area Technical Communities

New partner webinars and events available now

US Partner Community partner call schedule

Community calls and a regularly updated, comprehensive schedule of partner training courses are listed on the Hot Sheet.

High Availability for Virtual Machines


Hello:

 

When we talk about high availability in an on-premises data center, the infrastructure typically includes redundancy or failover elements: network routers, fiber routers, servers with two network cards, two fiber cards, and two power supplies; likewise, the storage units have two processing units with two cards each to achieve high availability.

 

When we talk about high availability in the cloud, we apply the same concept of having multiple resources to achieve redundancy and avoid service downtime, with one great advantage: the physical resources available in Azure data centers give us more redundancy due to the sheer volume of equipment. Like any physical equipment on the planet, they can suffer failures and downtime, which can temporarily make the virtual machines residing on them unavailable. Let's look at how we can minimize the impact of these events on the services running on our virtual machines and obtain the 99.95% service level described in the SLA for Virtual Machines.

 

How do we take advantage of these resources for our virtual machines?

By providing the same service on multiple virtual machines in Azure, we aim to obtain high availability and minimize downtime. However, Microsoft does not know what is running on the virtual machines: they could be running on the same physical server, or on servers that share a single power source and a single network router. In the event of planned or unplanned maintenance, this could result in a service interruption that could have been avoided.

 

The way we tell Azure that we want high availability for the roles running on our virtual machines is through Availability Sets.

 

By using Availability Sets on our virtual machines, we inform Microsoft that this group of virtual machines provides the same service and, for high-availability reasons, should not be interrupted at the same time. This is very important: since we have no control over which physical servers the machines will reside on, Microsoft will place them in compliance with these requirements.

 

Availability Sets are the way to group the virtual machines that provide the same service in order to minimize the risk of interruptions to it.

 

How do Availability Sets work?

Let's draw an analogy with an on-premises data center using the following diagram.

 

We have 4 racks with 7 servers each. Racks 1 and 2 are fed from a common power source that is different from the one feeding racks 3 and 4. If power source 1 fails, racks 1 and 2 go down, but racks 3 and 4 stay up; therefore, the servers that share common physical infrastructure constitute a fault domain.

Likewise, let's say that maintenance and firmware updates are scheduled per rack. When scheduled maintenance is performed, the servers in rack 1 restart simultaneously without interfering with the servers in racks 2, 3, and 4; therefore, each set of servers that undergoes maintenance simultaneously constitutes an update domain.

 

When defining our Availability Sets, we can define up to 3 fault domains and 5 update domains. That way, in a hardware failure situation, one out of every 3 virtual machines will restart but the other 2 will not be affected; likewise, during scheduled maintenance, one out of every 5 virtual machines may restart without affecting the other 4.
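As a sketch of how this looks in practice with the Azure PowerShell (AzureRM) module current at the time of writing (the resource group, names, and location below are made-up examples):

```powershell
# Create an availability set with the maximum 3 fault domains and 5 update domains,
# then reference it when creating each VM that provides the same service.
New-AzureRmAvailabilitySet -ResourceGroupName "MyRG" -Name "WebTierAvSet" `
    -Location "East US" -PlatformFaultDomainCount 3 -PlatformUpdateDomainCount 5

$avSet = Get-AzureRmAvailabilitySet -ResourceGroupName "MyRG" -Name "WebTierAvSet"
# Each VM configuration then references the set via -AvailabilitySetId $avSet.Id
```

With the VMs spread across the set's fault and update domains, no single power failure or maintenance window takes down every instance of the service at once.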

 

For the official documentation, please see:

Manage the availability of Windows virtual machines in Azure

https://docs.microsoft.com/en-us/azure/virtual-machines/windows/manage-availability

 

Thank you, and we hope you found this useful.

 

Regards,

 

Mariano Carro

Email latampts

 

 

 

OMS Solutions for Nutanix from Comtrade Software – Now Generally Available


Comtrade Software's OMS solutions for Nutanix are now generally available! They let users monitor, analyze incidents, and analyze logs for on-prem Nutanix enterprise clouds. They extend OMS with the following:

  • Out-of-the-box alerts
  • Historical Nutanix performance metrics, such as cluster/host/storage/VM latency, input/output operations per second (IOPS), and resource utilization
  • Identification of situations where more SSD storage resources need to be added to maintain low I/O latency
  • Timely identification of Nutanix hosts whose virtual machines are heavily or lightly loaded
  • A central point for Nutanix log and event collection and analysis, correlating all Nutanix log files across Controller VMs
  • Identification of all hosts that support the Redfish standard, providing the health status of hardware sensors across the monitored environment

Comtrade Software's OMS solutions for Nutanix comprise the following six solutions:

  • Nutanix Cluster
  • Nutanix Hardware
  • Nutanix Hardware Sensors
  • Nutanix Storage
  • Nutanix Virtual Machines
  • Nutanix Logs and Event Analytics

Each solution monitors and analyzes its own area of Nutanix; deploy them all and you get complete coverage of Nutanix monitoring and analytics.

Comtrade's data-analytics software is required to use these solutions. A fully functional 45-day trial is available from this link, with no automatic renewal and no obligation. You can also find these solutions in the Azure Marketplace.

These solutions are updated regularly; for the latest update details, please see the Comtrade blog.

 

 

A community focused on Dynamics small and midsize customers: DIRECTIONS ASIA, Bangkok, mid-March [Updated 12/19]


The environment around Microsoft Dynamics is changing faster than ever. Join the Directions Asia 2018 Conference to check out the latest developments in Microsoft Dynamics products and related ISV products, and learn how to build relationships within the network of Microsoft Dynamics SMB partners doing business across the Asia-Pacific region.

Registration is now open for Directions ASIA 2018 attendees and sponsors.

 

Attendee registration and discounts

To register as an attendee, click the button below to access the Directions Asia 2018 registration portal. We recommend registering early to get the Super Early Bird discount.

 Super Early Bird price:  2017/12/1 through 2017/12/31  299 USD
 Early Bird price:  2018/1/1 through 2018/2/28  349 USD
 Normal price:  2018/3/1 through the day of the event  399 USD

 

Register as an attendee

 

 

Sessions

Directions ASIA 2018 takes place in spring, around the time new Microsoft releases are announced and shipped, so the conference will offer detailed information on:

  • The Microsoft Dynamics product roadmap, including Dynamics NAV 2018 R2, Dynamics 365 Tenerife, and the Dynamics 365 Sales and Marketing applications

Other important topics will also be covered, such as:

  • Extensions V2, Visual Studio Code, integration with Office 365, Power Apps, Power BI, Microsoft Flow, and Docker
  • A variety of ISV solutions and services, and programs aimed at streamlining partner relationships with Microsoft.

In addition, the conference is an opportunity for partners to network and to explore the potential of growing together with Microsoft.

 

Register as an attendee

 


* Microsoft Japan does not handle registration or other inquiries for this conference. For inquiries, please contact the organizers directly through the English-language pages linked here.

 

Real Stories Of Teaching With Minecraft:EE


Last week I shared a blog post from the Microsoft Australia Education Team about the difference accessibility to devices makes for learners and this week I see they’ve shared an excellent webinar showcasing the work of four teachers from New South Wales.

As always, I refer you to the original blog post to read in detail:

Minecraft Education Edition takes the Australian Curriculum into a whole new world.

The webinar is definitely worth watching as well:

A few things really stick out to me from listening to these passionate educators:

  • The ability to integrate Minecraft:EE and game based learning across curriculum areas. Many of the examples shared link Science, Mathematics, Geography, History and English into a thematic unit.
  • Dispelling the notion that gaming and game based learning are only for boys. One of the teachers works in a girls' school and describes very high levels of interest and enthusiasm for the activities.
  • Strong linking to the Australian Curriculum. This can, of course, be adapted for other curricula in different countries, but it's pleasing to see a solid pedagogical basis for the learning.
  • Some of the teachers who shared explained that this was the first time they had used Minecraft:EE and that they were nervous about its outcome. Nevertheless, they were prepared to give it a go, and demonstrated all the attributes of a lifelong learner.
  • Exporting of models and maps designed in Minecraft:EE to 3D printers to take ideas from concepts to production. This is an important workflow that makes the learning “real” for many students.
  • The use of Microsoft OneNote Class NoteBooks to tightly plan and structure the lessons and outcomes that students were required to work towards. This can provide the framework for students and also confidence for teachers that there is a structure and direction to the lessons and it’s not always simply “playing”.

Using OneNote as a PBL Template for structuring activities in Minecraft

  • Digital Citizenship can be taught and “lived” through the playing of Minecraft.

Screenshot of a general store as part of the Gold Trail in Victoria, Australia

My Point Of View:


Student drawings of the “Code Agent” in Minecraft:EE

Hearing first-hand from other educators about their fears, challenges and successes in the classroom is important for teachers. The webinar above provides some insights into the learning of four teachers and is worth listening to, as they are pretty candid about what did and did not work. I am particularly interested in the discussion around the teaching of Coding and Computational Thinking through Minecraft, as this is something I've blogged about before and it is particularly relevant in NZ with the changes to the Digital Technologies curriculum strands. From the original blog:

With the integration of Code Builder in Minecraft Education Edition, Lynne Telfer's students have been exploring ways to program their Code Agent, powered by Microsoft MakeCode. This requires students to apply their computational thinking and develop algorithms (sets of instructions) for the Code Agent to create with structure blocks within Minecraft Education Edition. The Code Agent is a fantastic way to expose students to both visual programming and free coding in JavaScript. Students experience the benefits of its application through designing their own commands and solving efficiency challenges when building digital artifacts.

Again, have a look at the original blog post for more details and if you’re interested in giving Minecraft:EE a go in your classroom then perhaps check out this blog post about how to deploy and code in Minecraft.


AI that answers the call for intelligent tools


[Blog post date: December 13, 2017]

Posted by: Allison Linn

Jordi Ribas (left) and Kristina Behr (right) presented Microsoft's technical advances in AI at an event in San Francisco (photo: Dan DeLong)

 

Think back to the early days of the internet. You probably used a search engine much like a phone book, to find specific information such as a website, an address, or a phone number.

Technology has evolved enormously since then, and so have our expectations of it. Today, people want more intelligent answers. They may want to know whether the fitness program they're considering is any good, or whether the latest Marvel movie is worth seeing. Sometimes they want to use their favorite search tool with nothing more than the vague request "I'm hungry."

When people make such requests, they don't just want a list of websites. They may want recommendations for restaurants in the city they're visiting, or a diverse set of answers that gives them multiple perspectives on a topic. They may even need help figuring out what question to ask in the first place.

At a Microsoft event in San Francisco on Wednesday, Microsoft executives announced a number of technical advances in the Bing search engine, the intelligent assistant Cortana, and the Office 365 productivity tools, all using AI (artificial intelligence) to help people with nuanced information and complex requests.

These products and services are the fruit of Microsoft's ongoing effort to integrate AI across its entire product line, and the latest examples of engineers and computer scientists using AI to help people in areas beyond classifying information.

"AI has made great strides in finding information, but the real challenge is getting it to understand that information," says Kristina Behr, partner design and planning program manager in Microsoft's Artificial Intelligence and Research group.

Jordi Ribas, the corporate vice president who oversees Microsoft's AI products, says that when people hear "AI" they often picture robots or self-driving cars. What many don't realize is that AI is already having a clearly beneficial impact on daily life in products people use every day, such as search engines and Office 365.

"AI has been integrated into society over many years. People have used it for so long that they sometimes don't even notice," Ribas says.

At the San Francisco event, Microsoft showed its latest AI advances in intelligent search, aimed at providing people with richer, more useful information.

These advances include computer vision and object recognition technology that helps people discover the information inside photos, and machine reading, which uses machine learning to read content and understand concepts, for example, that a cousin is a member of the family.

Microsoft is also developing systems that help users get information even when they aren't sure how to find it. Say you want to turn on Bluetooth on a new device: the new system can ask you for additional information, such as the type of device and the operating system you're using.

Another new AI advance in Bing aims to provide multiple perspectives on search requests that tend to be subjective. Ask Bing "is cholesterol bad for you?", for example, and it will show two different perspectives on the question. This is part of a Microsoft development effort built on the premise that some questions have no single definitive answer.

Ribas says these intelligent search tools are part of Bing's effort to meet people's expectations for finding information. People now look to the internet not just for facts but for opinions and analysis. Bing aims to serve that need while clearly distinguishing between facts with one settled answer and analysis or opinion with multiple perspectives, for example, between Alexander Hamilton's date of birth and whether "Hamilton" is a good musical.

"What we aim for with Bing is to provide the best results from across the entire web. We want to deliver the most comprehensive, relevant, and trustworthy results. People often want more than answers that are as clear-cut as a math equation. We want to surface the opinions on such questions and present them in a balanced, objective format," Ribas says.

As part of this effort, Microsoft also announced a partnership with the social news aggregation site Reddit. Through the partnership, Bing search results will be able to include Reddit conversations and community perspectives.

Microsoft also announced plans to surface results from Reddit's well-known AMA ("ask me anything") sessions. Reddit has hosted AMAs with people such as Barack Obama and Bill Gates.

Reddit co-founder Alexis Ohanian notes that Reddit holds unique data found nowhere else in the world: expertise on everything from the best oils for beard grooming to how to make good gluten-free pancakes. And the community's voting system, which pushes useful answers to the top, makes valuable information easy to discover.

Reddit co-founder Alexis Ohanian (photo: Peter DaSilva)

 

Conversely, Ohanian says, Microsoft has skills Reddit does not: the ability to quantify and analyze Reddit's data and deliver it to people as useful information through Bing search results.

Put together, Ohanian believes, the two companies can deliver excellent search results drawn from a community of experts that was previously inaccessible.

"We can now handle complex, nuanced questions," Ohanian says.

情報をどこでも、好きな形で

インターネット検索の過去を再び思い出してみてください。おそらくは、ほとんどの検索を机の上のコンピューターから行なっていたことでしょう。今では、人々はどこにいても情報を求め、手でタイプできない時や、目で画面を読めない時でも情報にアクセスしたいと考えています。

マイクロソフトのパーソナルデジタルアシスタント Cortana の機能の多くはBing 検索エンジンで実現されているため、Bing の進化により、キーボードやディスプレイがない環境でも人々が有用な情報を得るための助けが得られるとベアは述べています。検索結果をサマリーするマシンリーディングや追加質問をしてくれる会話型検索などはこのような環境で価値を発揮します。

これらの技術進化はウェブ検索だけに適用されるのではありません。

サンフランシスコのイベントで、マイクロソフトは、例えば帰宅途中に Cortana がメールを整理して、上司や妻からの重要なメールをサマリーしてくれるというデモを通し、Cortanaの機能紹介を行いました。この機能は、たとえば、個人メールがGmail、仕事のメールがOutlookといった混在環境でも機能します。

また、マイクロソフトは、スキルチェイニングという機能も紹介しました。これは、有用と思われる別のスキルを提案してくれる機能です。たとえば、Cortana を使ってイベントのチケットを予約すると、Cortana はカレンダーにそのイベントの予定を追加するよう提案することができます。

従来型のコンピューター以外のデバイスを使っている人々に対して有用な情報を提供するための最善な方法はまだ明らかではないとベアは指摘します。これが、マイクロソフトが、技術的課題とデザイン上の課題の両方からアプローチしなければならない理由です。
「最終的にどのような姿になるかは現時点ではまったくわかりません」とベアは述べています。

AI in Office 365

Even before the internet came into wide use, Microsoft Word had a spell checker to fix spelling mistakes and typos.

Today, people use Office 365 for far more than basic spell checking: getting phrasing suggestions in Word documents, having PowerPoint design presentations automatically, having Outlook sort email by importance, and more.

Rob Howard, a director of Office marketing, says customers may not realize it, but AI has powered many of these convenient features for a long time.

"Office is one of the most important vehicles for putting AI to work for people. People use AI to be more productive, to improve the quality of their writing, and to create beautiful, impactful presentations," Howard says.

Many of these features aim to save substantial time in the long run by giving people small bits of help, such as correcting a misspelling or guiding the placement of a photo, Howard notes. But AI tools are increasingly being applied to significant tasks that once required AI specialists.

At Wednesday's AI event, Microsoft announced a preview of the Insights feature in Excel. It uses machine learning to analyze the data in an Excel spreadsheet and produce useful information such as pivot tables and charts that show trends.

"Data is extremely valuable, but that value only materializes when you can actually extract insights from it," Howard notes.

Many of the recent additions to Office 365 are made possible by the large volumes of data Microsoft holds. Suppose you're reading a document and run into an acronym you don't know. At Wednesday's AI event, Microsoft announced plans to release a new Word feature called Acronyms, which analyzes large numbers of Office documents and emails to help surface definitions of acronyms specific to your organization.

A feature called Tap for Word lets you search for related documents, spreadsheets, and presentations without leaving the document you're working on. That means you can add a profit-margin chart to a presentation without digging through hundreds of emails or the company website.

Microsoft also announced plans for a tool that highlights the items in an email that require action and offers options for responding quickly, even when you're on the go.

Many of these products and services use the cloud to process information. And because Office 365 runs in the cloud, users continually benefit from improved features.

For example, when the Designer feature first arrived in PowerPoint, it could take a few photos and create a layout. Today, the feature can take a bulleted list, understand that the user is outlining a series of steps, and create a presentation that highlights those steps.

"The cloud has greatly accelerated Microsoft's pace of development. We can deliver more features to end users, faster," Howard says.

 

---

All content on this page is current as of the date it was written and is subject to change without notice. Where formal internal approval or contracts with other companies are required, nothing is final until those are in place. Various circumstances may also cause some or all of this content to be changed, canceled, or become impracticable. Thank you for your understanding.

Script Wars: The Farce Awakens (part II)


Summary: Yesterday, we met a newly graduated IT professional, Rey Skyworker, as she discussed the ways of "The Farce." It was during this discussion that her instructor, Ben Kerberosie, discovered she had a natural gift in understanding how to implement some good practices into writing her scripts.

Today, we sit quietly (I mean all of you in the back too, no chattering and tossing about popcorn!), as Rey is about to embark on her new job.

The position: on the help desk. The company: Contoso Holly Jolly Hat Company.

She is being introduced to her new co-worker on the help desk, Jeremy Tinnison.

-----------------

Jeremy shook her hand.  "Welcome to Contoso, Rey, I'm Jeremy but everybody around here just calls me Tin."

She looked up. "Ok Tin. Interesting environment you have here. So what do you do here mostly?"

Ben looked over. "Go ahead Tin, let her know of the challenges you're having. She's well versed in Windows PowerShell and should be able to help you past some of your automation challenges."

Tin was about to speak when a small vacuum with googly eyes bumped into Rey's feet.

"What th…," she looked down as it rapidly began trying to vacuum her shoelaces. The battle to regain control of her feet was more than amusing. The small, football-shaped object rolled about as if in disgust.

"That," muttered Ben, "is one of the experiments from R&D. It is the 'Trash Bagger 7.5' or 'TB-7' for short. It roams the office trying to find garbage, gum wrappers or bits of LAN cables. It occasionally makes mistakes like you just saw. Sometimes we find it picks up the odd power cord and unplugs systems. They are working on Release 8, which should resolve these unexpected issues."

Rey looked down, smiling at the rolling, blinking nightmare. "Silly thing."

"So!" burst out Tin, "Let me tell you about one of our current challenges. We have a Windows PowerShell script that creates a user in our Azure Active Directory environment. Licensing for Office 365 is still mostly a manual process. Our short-term issue is that we need to find a way to trap for errors in the script."

He showed Rey the initial script they used to provision users in Azure Active Directory.

$First=Read-Host 'Enter First Name:'

$Last=Read-Host 'Enter Last Name:'

 

Connect-AzureAd

 

$DisplayName=$First+' '+$Last

$Mailnickname=$First+$Last

$UserPrincipalName=$First+$Last+'@contoso.com'

 

$TempPassword='BadPassword4U!'

$PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile

$PasswordProfile.Password=$TempPassword

 

New-AzureADUser -DisplayName $DisplayName -GivenName $First `

-Surname $Last -AccountEnabled:$true -PasswordProfile $PasswordProfile `

-MailNickName $MailNickname -UserPrincipalName $UserPrincipalName

Ben stepped up. "Occasionally some of the new staff don't follow the instructions, and the script throws an error. We'd like to find a way to trap for the various errors in PowerShell. It may cause the script to stop, but we'd like to find a way to clean up or return codes back to calling scripts down the road."

Tin ran the script to demonstrate, by deliberately giving blank values for the first and last name.

Screenshot of PowerShell

"Rather than have this appear," continued Tin, "we'd like to be able to trap for the individual types of errors in PowerShell whenever possible. We should still log them, but we also want the script to take action in certain scenarios."

"You can if you like," ventured Rey. "Access the $Error variable in PowerShell. It's not just a text array: it's actually an object that contains all the information about the errors."

Rey stored away the value of the last error on the screen to view its properties.

$ErrorToSee=$Error[0]

"If we pipe this into Get-Member, we can see it has many properties, including one called 'Exception.'"

Screenshot of PowerShell

"The 'Exception' property contains the actual object with the exception value PowerShell caught. We can view it like this."

Screenshot of PowerShell

Tin's eyes lit up. "Oh! If I run that against Get-Member, will it expose more information?"

Tin piped the output to Get-Member to view the additional properties.

Screenshot of PowerShell

Rey looked at the output. "In this case, what we need isn't a property; it's the type of the exception that was thrown. We can use the GetType() method to pull this information out."

Screenshot of PowerShell

"From the screen, we can see the error type is 'ApiException', but we need its full name.  Fortunately, we can look for other members with 'Name' in their description."

Screenshot of PowerShell

Tin looked down at the results. "I'm guessing 'FullName' is just too obvious?"

Rey nodded. "Just add it on to GetType() and you'll have your answer," as she typed quickly into the console.

Screenshot of PowerShell
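Gathered into one sequence, the console steps Rey walked through look like this (the exception type shown in the comment is the one from this environment's screenshots):

```powershell
# Walk from the most recent error record down to the exception's full type
# name, which is the string a Catch block filters on.
$ErrorToSee = $Error[0]                     # most recent error record
$ErrorToSee | Get-Member                    # its properties, including Exception
$ErrorToSee.Exception                       # the exception object itself
$ErrorToSee.Exception.GetType().FullName    # e.g. Microsoft.Open.AzureAD16.Client.ApiException
```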

"Now, to use this, we just wrap the code in a 'Try Catch' statement in PowerShell. The Try block's job is literally to 'try it out', while the Catch blocks name the errors to catch. This allows us to write code to mitigate, report, or trap the errors."

Rey added a simple Try Catch statement around the existing block of code in the PowerShell script, using the identified error.

$First=Read-Host 'Enter First Name:'

$Last=Read-Host 'Enter Last Name:'

 

Connect-AzureAd

 

$DisplayName=$First+' '+$Last

$Mailnickname=$First+$Last

$UserPrincipalName=$First+$Last+'@contoso.com'

 

$TempPassword='BadPassword4U!'

$PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile

$PasswordProfile.Password=$TempPassword

 

Try

     {

     New-AzureADUser -DisplayName $DisplayName -GivenName $First `

     -Surname $Last -AccountEnabled:$true -PasswordProfile $PasswordProfile `

     -MailNickName $MailNickname -UserPrincipalName $UserPrincipalName

     }

Catch [Microsoft.Open.AzureAD16.Client.ApiException]

     {

     Write-Output 'A blank name was supplied.   Restart the script.'

     }


 

When they re-ran the script with the names supplied as blank, the results were far nicer.

Screenshot of PowerShell

Tin looked up. "Oh! I can trap for additional errors as well?"

Rey noted, "Just add an additional Catch statement for each unique error condition. I used Write-Output as an example, but you can put in any PowerShell code to deal with the errors."
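As a sketch of what Rey describes, multiple Catch blocks can be stacked; the final fallback Catch below is illustrative, not part of Contoso's script:

```powershell
Try
    {
    New-AzureADUser -DisplayName $DisplayName -GivenName $First `
    -Surname $Last -AccountEnabled:$true -PasswordProfile $PasswordProfile `
    -MailNickName $MailNickname -UserPrincipalName $UserPrincipalName
    }
Catch [Microsoft.Open.AzureAD16.Client.ApiException]
    {
    Write-Output 'A blank name was supplied. Restart the script.'
    }
Catch
    {
    # Fallback for anything not matched above; $_ holds the current error record.
    Write-Output "Unexpected error: $($_.Exception.Message)"
    }
```

PowerShell evaluates the Catch blocks in order and runs the first one whose type matches, so the most specific exception types should come first.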

Tin was excited! "What else can we do with this script? I want to make all of this seamless for the staff!"

Stay tuned for tomorrow's episode of "Script Wars," as Rey touches on more ways to make the script stronger with "the Farce."

Sean Kearney, Premier Field Engineer

Enterprise Services Delivery, Secure Infrastructure

 

 

Enabling LSA Protection on Windows 10 Build 10240 Renders the OS Unbootable


Hello, this is Ogiya from Windows Platform Support.

This post covers a known issue in which the OS can no longer boot after LSA (Local Security Authority) protection is enabled.

LSA manages credentials such as users' password hashes.
On Windows 8.1/Windows Server 2012 R2 and later, you can enable LSA protection so that this information cannot be stolen.
To enable LSA protection, set the following registry value and restart the OS.

 

Key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa
Value: RunAsPPL
Type: REG_DWORD
Data: 1

 

However, due to a bug in Windows 10 build 10240, if you enable LSA protection with the setting above and restart,
the OS can no longer boot and instead shows the Automatic Repair screen, as below.
If this occurs, the OS must be reinstalled.

This issue occurs only on Windows 10 build 10240 (Windows 10 TH1/Windows 10 2015 LTSB)
and is fixed in the next release, Windows 10 build 10586 (Windows 10 TH2), and later.
We apologize for the inconvenience caused by this issue in our product.
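On builds where the fix is present (10586 and later), the registry change can be scripted. The following is a hedged sketch, not an official procedure, that refuses to run on the affected build 10240; run it from an elevated PowerShell prompt, and note that a restart is still required afterward:

```powershell
# Guard against the affected build before enabling LSA protection (RunAsPPL).
$build = [System.Environment]::OSVersion.Version.Build
if ($build -eq 10240) {
    Write-Warning 'Build 10240: enabling RunAsPPL makes the OS unbootable. Aborting.'
} else {
    Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa' `
        -Name 'RunAsPPL' -Type DWord -Value 1
    Write-Output 'RunAsPPL set to 1. Restart the OS to enable LSA protection.'
}
```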


You can check which build you are running by executing the winver command and looking at the window it displays.


- References

Configuring Additional LSA Protection
https://msdn.microsoft.com/en-us/library/dn408187

Word's Researcher Feature: Search, Click, and Import in One Place, Without Switching Windows


The Researcher feature makes writing papers easier. Backed by the Microsoft Bing Knowledge Graph, it looks up related information on the web so you can search, click, and import directly inside Word, with no need to switch back and forth between windows. When you add a source, a citation is also generated automatically in your paper's bibliography!

▲ Click "Researcher" on the "References" tab

▲ The research pane opens

▲ Enter the keywords you want to look up in the search box

▲ Find the details you need

 

Those are the steps. Researcher helps you find material from reliable sources and bring it into your paper, so you get more done with less effort. Give this simple, practical feature a try!

Update: How Backup and Restore Work in Office 365


Original post written by Joakim Grepperud, Office 365 Product Manager, April 28, 2017. This is an update by Eirik Christiansen, Technical Solution Advisor, Collaboration, December 18, 2017.

 

With Office 365, the workplace gets tools that make it easy to work together. Microsoft now has more than 100 million Office 365 users, and 50,000 new businesses adopt the services every month. To deliver and operate services at this scale, the operational routines around backup and restore must work as smoothly and seamlessly as possible. Office 365 has extensive routines and mechanisms in place to ensure data is not lost to system failures, and one of the measures we take is keeping multiple copies of the data. This allows Microsoft to ensure that your customer data is available to you at least 99.9% of the time; in practice, we see Office 365 uptime of 99.98%.

It also happens that a user deletes a document or an email and later realizes it is still needed. With Office 365, we have made it easy for users to restore emails and documents themselves, and retention policies can be used to ensure information is preserved for a period the organization itself decides.

In this article, you will learn how backup and restore of emails and documents work, and which configuration options your organization has to control this yourself.

 

How Microsoft backs up mailboxes

We take continuous backups of mailboxes and spread them across our data centers in Ireland, the Netherlands, Austria, and Finland. This means Office 365 automatically keeps your mailbox working if a disk crash or other system failure occurs. We use the Exchange Native Data Protection technology to achieve this. Three of the copies are configured for high availability, and the fourth is set up as a lagged database copy to guard against system failures that affect the first three.

This means Exchange Online is protected by every trick in the book, while your data is replicated between different locations: there are always multiple copies of the data set, which provides a high degree of safety and assurance.

 

How Microsoft backs up documents

For documents and other information stored in SharePoint Online and OneDrive for Business, multiple backups of the data are taken as well. When a user uploads a document, it is mirrored to another area of the data center. From there, asynchronous log shipping is performed to another data center in the region (Europe for Norwegian customers). In addition to synchronous and asynchronous replication, customer data is backed up several times a day. These backups are also asynchronously replicated to another data center.

Microsoft keeps backups of SharePoint/OneDrive for the last 14 days at all times.

 

Custom retention policies for mailboxes and documents

Office 365 includes functionality that lets the organization itself set rules for how long emails and documents in Office 365 are preserved, regardless of what the user does or whether the user account is deleted. These are called retention policies and are part of "Security & Compliance" in the Office 365 admin portal. With retention policies, you can specify who a rule applies to and how long the data should be kept. For example, you can create a retention policy saying that all content in all users' OneDrive for Business libraries must be kept for 10 years after the last modified date, even if the user deletes documents. Similar rules can be defined for several other Office 365 services.
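Retention policies can also be created from Security & Compliance Center PowerShell. A sketch of the 10-year OneDrive example described above, assuming a connected Security & Compliance PowerShell session; the names and exact parameter values are illustrative:

```powershell
# Create a retention policy covering all OneDrive for Business sites,
# then a rule that keeps content for 10 years after last modification.
New-RetentionCompliancePolicy -Name 'OneDrive-10yr' `
    -OneDriveLocation All

New-RetentionComplianceRule -Name 'OneDrive-10yr-rule' `
    -Policy 'OneDrive-10yr' `
    -RetentionDuration 3650 `
    -ExpirationDateOption ModificationAgeInDays
```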


Screenshot from the Office 365 admin center. These are the locations where retention policies can be defined, along with who the rules apply to.


A simple interface is provided for defining how long information should be kept.

 

Restoring emails and documents

If a user deletes an email or a document, the user can restore it themselves. It is also possible to contact Office 365 Support to request restoration of entire document libraries in SharePoint Online and OneDrive for Business if necessary, for example after a ransomware attack.

 

How self-service restore works

If a user deletes an email, the first step is to check the mailbox's Deleted Items folder. All emails remain there until the user empties the folder, or until it is emptied automatically by a deletion rule defined by the organization. If an email has been deleted from Deleted Items, the user has 14 days to recover it before it is permanently deleted. This is done from the Deleted Items folder available at https://mail.office365.com. An IT administrator can choose to increase this period from 14 to 30 days.

If a user deletes a document in Office 365, it can be restored for up to 93 days after deletion. The user goes to the recycle bin in OneDrive for Business/SharePoint Online and restores the document from there. If the documents are synchronized to your PC, you will also find the document in the Windows Recycle Bin.

If, within those 93 days, the user also deletes the file from their recycle bin, it lands in an administrative recycle bin where an administrator can restore the files for 93 days from the date the item was originally deleted from Office 365.

If retention policies are enabled, the organization's Office 365 administrator can restore emails and documents that have been deleted by the user. This is done via a dedicated search service in the Office 365 admin portal. For example, you can search for given keywords in SharePoint Online and filter the search by document author, time period, title, and much more.
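The extension of the deleted item window from 14 to 30 days mentioned above is set per mailbox from Exchange Online PowerShell; a sketch with an illustrative mailbox identity:

```powershell
# Extend the deleted item retention window from the default 14 days to 30.
Set-Mailbox -Identity 'kari.nordmann@contoso.com' -RetainDeletedItemsFor 30

# Or apply the same setting to all existing mailboxes in the tenant.
Get-Mailbox -ResultSize Unlimited | Set-Mailbox -RetainDeletedItemsFor 30
```

Note that the second command only affects mailboxes that already exist; newly created mailboxes keep the default unless they are updated as well.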

 

Restoring documents hit by ransomware

Ransomware comes in many forms, which determines how the information can be restored. In many cases, versioning can be used to restore a document to an earlier version. In other cases, the recycle bin in OneDrive for Business or SharePoint Online can be used to restore documents. In some situations, it may be more practical to contact Office 365 Support and request that an entire document library be restored to an earlier date. It is important to do this quickly after the attack, to be sure we can restore the library.

In Q1 2018, a new feature called Files Restore will also arrive in OneDrive for Business. It lets the user roll their OneDrive back in time to a point before it was hit by ransomware.


Screenshot of the upcoming Files Restore feature in OneDrive for Business.

 

Adoption and behavioral change

With Office 365, your users get expanded options for recovering deleted data themselves: via the recycle bins in OneDrive and SharePoint, via the upcoming Files Restore feature in OneDrive, and via the Deleted Items folder in Outlook. For many, this is a new way of working that your users need to be made aware of and trained in.

For those who deliver IT services in the business, this is also an adoption of a more modern IT platform where people produce, share, collaborate, and store in new ways. IT will move from a role with clear ownership of the data stored in the company's systems to a new role as a facilitator, providing good tools with security mechanisms that cover the needs of both users and the business.

 

Further recommendations and useful links

Backup in Office 365 requires a different mindset than before.

  •  In a cloud service such as Exchange Online, there is less need for backup than before. Exchange Online is protected with multiple copies of the data in multiple locations, so your data remains safe even in the unthinkable event of a total collapse of one of our data centers.
  • Services such as Groups, Teams, and Planner also use Exchange and SharePoint functionality; a full restore of these services would be difficult with traditional backup tools.
  • Users have greater ability to recover their deleted data themselves, while mechanisms such as retention policies can be enabled to reduce the risk of a user accidentally deleting critical data.

 

If you have questions, Office 365 Support is happy to provide guidance on how to perform the various operations. You can reach support via the Office 365 admin portal. Also check out the Microsoft Tech Community, where you can join discussions and ask questions on all topics related to Office 365.

Useful links:

Office 365 admin portal: https://portal.office.com

Configure retention policies: https://protection.office.com/#/retention

Retention documentation: https://support.office.com/en-us/article/Retention-in-the-Office-365-Security-Compliance-Center-2a0fc432-f18c-45aa-a539-30ab035c608c?ui=en-US&rs=en-US&ad=US

Microsoft Tech Community: https://techcommunity.microsoft.com

More details on backup and ransomware: https://www.microsoft.com/en-us/download/details.aspx?id=53560

 

Video with a demo of the solution:

On Outbound Data Transfer (Data Transfer Out) from VMs


Hello, this is Hiruma from the Azure support team. As you all know, traffic leaving an Azure virtual machine (VM) for the internet or for another datacenter is billed. In this post, let's take a look at outbound traffic from VMs.


Am I sending too much traffic?

A frequently asked question

Q: My VM recorded more outbound traffic than expected. Is it possible to investigate a historical breakdown of what that traffic was?

A: Unfortunately, no.


… If the article ended right there, it would be nothing but disappointing, so let me add a little more explanation.

To put it somewhat formally, Azure, like most public cloud services, uses a model in which responsibility is shared between the customer and the provider (Microsoft). Broadly speaking, for VMs delivered at the IaaS layer, Azure is responsible for the underlying hardware environment, while configuration at the OS layer and above is managed by the customer.

In other words, the traffic generated at or above the OS layer inside a VM belongs to the customer, and for data privacy reasons Azure does not, as a rule, interfere with a customer VM's traffic or record its contents or destinations.

Accordingly, even our support team cannot look back and investigate a breakdown of past traffic. However, if more traffic than expected is being generated right now, the following techniques can help you find clues.

1. Capture packets inside the VM

This is a primitive method, but it directly captures what the OS is sending and incurs no additional charges. Here are example procedures for Windows and Linux.

On Windows:

You can of course use a packet capture tool such as Message Analyzer or Wireshark, but if installing one right away is difficult, you can take a simple packet capture with the OS's built-in functionality.

Example procedure:

◇ Start the capture:
The steps below use C:\temp as the collection folder; in your environment, choose a location with enough free space that will not saturate storage I/O.

Run the following commands in order from an elevated command prompt.
This starts the capture.

cd C:\temp

netsh trace start capture=yes traceFile=C:\temp\NetTrace.etl

 

◇ Stop the capture:
After collecting for a while, run the following commands in order from an elevated command prompt on each virtual machine.
This stops the capture.

cd C:\temp

netsh trace stop

The resulting NetTrace.etl file is the packet capture. You can read its contents with Microsoft's Network Monitor or Message Analyzer.

On Linux:

As one example, you can capture packets with a tcpdump command like the following. The capture continues until you stop it with Ctrl+C.

Example:

sudo tcpdump -s0 -i any -n -w outfile.pcap

(For details on the command, see the tcpdump manual (man).
Also, adjust the output destination and parameters to suit your environment.)

The captured .pcap file can be analyzed with Wireshark.

2. Use Network Watcher's packet capture feature

If it is difficult to perform packet capture operations inside the VM, you can use the packet capture feature of Network Watcher. The Network Watcher extension is installed into the VM, and the captured packets can be written out to a storage account.

For the procedure and pricing, see the following documentation.

Manage packet captures with Azure Network Watcher using the portal
https://docs.microsoft.com/ja-jp/azure/network-watcher/network-watcher-packet-capture-manage-portal

Network Watcher pricing
https://azure.microsoft.com/ja-jp/pricing/details/network-watcher/

 

3. Use the Log Analytics Wire Data 2.0 (preview) solution

As an approach somewhat different from capturing and analyzing packets, you can also use Wire Data 2.0. It is in preview as of this writing and requires some installation steps, but it lets you visualize destination IP addresses and more, without any packet capture analysis expertise.

For an introduction to the feature, see the following documentation.

Wire Data 2.0 (Preview) solution in Log Analytics
https://docs.microsoft.com/ja-jp/azure/log-analytics/log-analytics-wire-data

The contents of this article (including attachments, links, and so on) are current as of the date of writing and are subject to change without notice.

Office 365 Workshop Links – December 2017


This is a link fest for the items discussed during a recent Office 365 workshop delivered in an unusually tropical Calgary last week.  Calgary +12 in December is always a win!!

Posting the links here since they will be available to all of the attendees, and thought that others may also find them useful/interesting.

PowerShell Tips And Tricks

Start with these three articles:

How To Maximize Exchange Administrator Productivity With PowerShell–Part 1

How To Maximize Exchange Administrator Productivity With PowerShell–Part 2

How To Maximize Exchange Administrator Productivity With PowerShell–Part 3

For all PowerShell posts, please review this tag.

Office 365 Public Roadmap

http://roadmap.office.com/en-us

Office 365 Datacentre Map

https://o365datacentermap.azurewebsites.net/

Current Exchange 2013 Cumulative Update

Exchange 2013 CU18

Current Exchange 2016 Cumulative Update

Exchange 2016 CU7

Exchange Online

Overview of inactive mailboxes in Office 365

Manage inactive mailboxes in Office 365

Exchange Hybrid

Office 365 Mailbox Migration – Target Mailbox Doesn‎'t Have An SMTP Proxy Matching

Delivery Failed From Office 365 Mailbox To On-Premises Exchange Mailbox

Office 365 Exchange Hybrid Deployments Busting The Autodiscover Myth

Office 365 Autodiscover Lookup Process

Users in a hybrid deployment can't access a shared mailbox that was created in Exchange Online

Cross Premises Shared Mailbox Support

Planning an Exchange hybrid deployment.  This page has the support statement around what cross-premises permissions are supported.  The below is from December 2017. Note that the cross-premises permission support has recently changed.  Please review the previous versions of this post to compare the older support statement.

  • Hybrid deployment requirements Before you configure a hybrid deployment, you need to make sure your on-premises organization meets all of the prerequisites required for a successful deployment. For more information, see Hybrid deployment prerequisites.
  • Exchange ActiveSync clients  When you move a mailbox from your on-premises Exchange organization to Exchange Online, all of the clients that access the mailbox need to be updated to use Exchange Online; this includes Exchange ActiveSync devices. Most Exchange ActiveSync clients will now be automatically reconfigured when the mailbox is moved to Exchange Online, however some older devices might not update correctly. For more information, see Exchange ActiveSync device settings with Exchange hybrid deployments.
  • Mailbox permissions migration  On-premises mailbox permissions such as Send As, Full Access, Send on Behalf of, and folder permissions, that are explicitly applied on the mailbox are migrated to Exchange Online. Inherited (non-explicit) mailbox permissions and permissions granted to objects that aren’t mail enabled in Exchange Online are not migrated. You should ensure all permissions are explicitly granted and all objects are mail enabled prior to migration. Therefore, you have to plan for configuring these permissions in Office 365 if applicable for your organization. In the case of Send As permissions, if the user and the resource attempting to be sent as aren’t moved at the same time, you'll need to explicitly add the Send As permission in Exchange Online using the Add-RecipientPermission cmdlet.
  • Offboarding  As part of ongoing recipient management, you might have to move Exchange Online mailboxes back to your on-premises environment. For more information about how to move mailboxes in an Exchange 2010-based hybrid deployment, see Move an Exchange Online mailbox to the on-premises organization.

    For more information about how to move mailboxes in hybrid deployments based on Exchange 2013 or newer, see Move mailboxes between on-premises and Exchange Online organizations in hybrid deployments.
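For the Send As case called out in the mailbox permissions bullet above, the explicit grant in Exchange Online looks roughly like this (the identities are illustrative):

```powershell
# Re-grant Send As in Exchange Online when the user and the resource
# were not migrated at the same time.
Add-RecipientPermission -Identity 'Sales-Shared' `
    -Trustee 'rhoderick@contoso.com' -AccessRights SendAs
```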

AD FS

(note the space between AD and FS)

Install AD FS 2016 for Office 365 – first post published.  Additional ones to follow.

How To Install AD FS 2016 For Office 365

Installing AD FS 2012 R2 For Office 365 – Step By Step series of 3 posts:

  1. Install ADFS

  2. Install ADFS Proxy

  3. Leverage ADFS with Office 365

ADFS 2012 R2 Extranet Account Lockout Protection

Directory Synchronisation

Plan to install Azure Active Directory Connect 1.1  - older solutions are deprecated and will exit support in early 2017.

Still Running DirSync and AAD Sync–Really Time To Update

DirSync release announcement of Password Sync.

Managing Directory Synchronisation – Notes From The Field

List of Attributes that are Synced by the Azure Active Directory Sync Tool

How To Run Manual DirSync / Azure Active Directory Sync Updates

DirSync: How To Switch From Single Sign-On To Password Sync

Modern Authentication

Authentication changes are available in Office 2013 and Office 2016.  The solution has been called Modern Authentication.

This was first announced at MEC 2014 and earlier on the Office blog.  The November update on the topic is here.

Training Links

Microsoft Virtual Academy – multiple training videos

Office Technical Blog

Garage Series

Service Descriptions

Exchange Online Service Description – required reading!  Especially the limits section.  Read this now.  Do not be surprised…..

Updated IE Support Policy

Stay up-to-date with Internet Explorer

Tools

MXToolbox – useful site to test DNS records, SMTP blacklists etc.

Remote Desktop Manager 2.7 – download

Test Exchange Connectivity (EXRCA)

Teds Webtools

Network Tools (note that there is a hyphen in the domain name)

EOP

EOP Field Notes – Andrew Stobart’s excellent EOP blog

Anti-spam message headers

Enhanced email protection with DKIM and DMARC in Office 365

DMARC - What is it?

Save The Date - End Of Forefront Protection 2010 For Exchange -- T Minus 12 Months

Various Links

RBAC Primer

RBAC manager – Codeplex

Exchange Autodiscover

Remove Management Role Entries – since was unable to pipeline this in Exchange Online

Sysinternals Tools easy download  – http://live.sysinternals.com/

Manual Exchange Hybrid Configuration Steps – no longer supported but revel in the fact that you now have the HCW!

Friday Morning Rant – Premise and Tenents

Quick Tip: Is There A Shortcut URL To Download Azure AD PowerShell?

Random Links

Password Strength

This is what happens when you reply to spam – TED talk

Weather forecasting stone

The Register – UK IT News site

Cheers,

Rhoderick


Deploy Server 2016 – Part 2


Welcome everyone to Part 2 of my experiences deploying Server 2016. In Part 1, I gave a breakout of the stages of my deployment, and some of the details about our initial Group Policies. This post will follow the next steps of Stage One, where we build out the first 2016 server in the environment, apply the policies, and submit the server to our security team for scanning.

Picking up where we left off, I have linked the two policies that were created, Server 2016 Security Policy and Server 2016 Standard Policy, to the Server 2016 Organizational Unit (OU).


Before we start building our 2016 server, we will create a new Computer Object in the Build Staging OU in the Server 2016 OU structure. Do this by opening Active Directory Users and Computers and browsing to the Server 2016 -> Build Staging OU. Right-click on Build Staging and choose New -> Computer. The reason we create our computer object in Active Directory before the actual build is that we want to be certain all our policies apply to the servers from the moment they join the domain. This helps ensure the security of the servers and of our environment.


In the New Object – Computer wizard, enter the Computer Name of the new 2016 Server.




Since we've already applied the policies at a higher OU, they will flow down and apply to any objects in the sub OUs, so our newly created computer object will receive them with no additional work.
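If you'd rather pre-stage the object from PowerShell than from the wizard, a sketch using the ActiveDirectory module follows; the OU path is assumed to match the structure described above:

```powershell
# Pre-create the computer object in the Build Staging OU so the linked
# GPOs apply from the moment the server joins the domain.
Import-Module ActiveDirectory
New-ADComputer -Name 'CONTOSO16' `
    -Path 'OU=Build Staging,OU=Server 2016,DC=CONTOSO,DC=COM' `
    -Enabled $true
```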

And now we move on to the creation of the server itself!

Following our plans for Stage One, we will install the server with the Desktop Experience. There are plenty of sites out there that detail the installation process for Server 2016 (like this one, and this one) so I won't duplicate the process here. I will call out the one part of the install that is important to us: the 'Select the operating system you want to install' screen, where it defaults to the Windows Server 2016 Datacenter edition. Don't be misled by the name: that is the Core install, and not what we're looking for right now. We'll come back to that another day, but for right now, be sure to click on Windows Server 2016 Datacenter (Desktop Experience) before clicking Next.


Now that our server is built and powered on, we'll log in as the local administrator and configure the server with a new IP address, name, and then we'll join the domain. Again, there are many resources detailing how to go through the initial configuration of a 2016 server (like here, here, and here), so I won't beat that to death here. I will show that I am going to join the CONTOSO.COM domain and my machine will be named CONTOSO16.


With the machine joined to the domain, we can verify that it is accepting the group policies by running the Group Policy Results Wizard from the Group Policy Management Console and check the Applied GPOs section.


Now that we have verified our new CONTOSO16 server has received the appropriate group policies, we can ask our security team to scan the server and verify that it meets all the requirements. For our test environment, they are testing to ensure all our policies are correct, and that they have been applied.

In our case, we have had several failures, listed here.

  • Issue 1: Ensure 'Audit PNP Activity' is set to 'Success'
  • Issue 2: Ensure 'Network access: Restrict clients allowed to make remote calls to SAM' is set to 'Administrators: Remote Access: Allow'
  • Issue 3: Disable IPv6 (Ensure TCPIP6 Parameter 'DisabledComponents' is set to '0xff (255)')

     

Issues 1 and 2 are new options for Server 2016, and Issue 3 has been a corporate preference. Microsoft recommends keeping IPv6 enabled, even if it is not being configured for use in the environment, but this is not a hard rule and in this customer's environment they have chosen to disable it.

Because Issues 1 and 2 are new, even after loading the newest templates into the PolicyDefinitions folder of our central store at %systemroot%\SYSVOL\sysvol\CONTOSO.COM\Policies, we won't be able to see those new settings if we open the Group Policy Management Console from any server below 2016, or from a workstation below Windows 10 running the Remote Server Administration Tools (RSAT). Likewise, any other new settings will only be visible from either a Windows 10 workstation or a 2016 server.

Here's the view from our 2012 R2 Domain Controller:



And here it is from a 2016 server:



Resolving the issue of turning off IPv6 is a little more complicated. As I've stated, Microsoft's official stance is to not turn it off, so there is no built-in policy to do so. Fortunately, there are a lot of great people out there and there are some template files which can be downloaded to help us resolve this issue. This link has an article on how to do so, and I'll summarize it here.

Once you have downloaded the IPv6Configuration.zip file, extract both the ADMX and ADML files. If your domain has a central store for the policies, the IPv6Configuration.admx file should be placed in the %SystemRoot%\SYSVOL\domain\Policies\PolicyDefinitions folder as shown below. If your domain is not using a central store (fix that, it should be), then place the file in the %SystemRoot%\PolicyDefinitions folder on each of your Domain Controllers or on the machine where you edit your group policies (i.e. a machine with RSAT installed). Depending on how many Domain Controllers you have, you'll quickly realize that a central store is much more efficient.


Now place the IPv6Configuration.adml file in the %SystemRoot%\SYSVOL\domain\Policies\PolicyDefinitions\en-US folder (replacing en-US as needed with your country code). Again, if you are not using a central store (which you should be), the file should be placed in the %SystemRoot%\PolicyDefinitions\en-US folder (replacing en-US as needed with your country code).
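As a sketch, both copies can be done from an elevated command prompt on a machine with access to SYSVOL (the file names and the en-US folder are from the article; adjust the language folder for your locale):

```shell
:: Copy the template into the central store (visible domain-wide):
copy IPv6Configuration.admx "%SystemRoot%\SYSVOL\domain\Policies\PolicyDefinitions\"

:: Copy the language file into the matching language subfolder:
copy IPv6Configuration.adml "%SystemRoot%\SYSVOL\domain\Policies\PolicyDefinitions\en-US\"
```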


Now that we've loaded in the IPv6 templates, we can see them in our Group Policy Editor. The new template will be visible in Computer Configuration -> Policies -> Administrative Templates -> Network -> IPv6 Configuration.


Once we open the policy and set it to Enabled, we'll see there are a lot of options in the drop-down menu. Since our customer wants IPv6 completely turned off, we'll choose Disable all IPv6 components. This equates to the 0xFF option that the customer requested. While this policy mentions Windows 7 and Windows Server 2008, it still applies to newer versions of Windows Server and Windows clients.
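Under the hood, this setting corresponds to the DisabledComponents registry value named in the security finding. If you ever need to check or set it on a single machine rather than via policy, a sketch from an elevated command prompt (a reboot is required for the change to take effect):

```shell
:: Check the current value (if the value is absent, IPv6 is fully enabled):
reg query "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip6\Parameters" /v DisabledComponents

:: Set 0xFF (255) to disable all IPv6 components:
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip6\Parameters" /v DisabledComponents /t REG_DWORD /d 255 /f
```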


After enabling and configuring the IPv6 policy, we will verify it has taken effect on our new server. From the command prompt on our CONTOSO16 server, we run the IPCONFIG /ALL command and verify that no IPv6 addresses are listed.


Now that we have verified that all our issues have been resolved, we can submit it back to the security team and have them run their scans for the official record. Our result this time: SUCCESS! Our server has passed all the necessary checks for the Development environment.

After completing our server build and passing the security checks, we can pass off our server to the VMWare administrators so that they can turn it into a template and deploy it to other environments for agent installation and further testing.

As our deployment project moves along, I'll post more details of the actions taken. Please post your comments, and if there's anything you'd like more details on, let me know!

[GDPRDemopalooza] Conditional Access


Based on the GDPR Demopalooza, here is the demo for Conditional Access (CA).

As always, split into two parts: the "Why" and the "How".

Why

 

In the context of GDPR it is important that PII [Personally Identifiable Information] is given (special) protection. In particular, access to such data must be especially secure, whereas access to "normal" data may be subject to less strict rules. In addition, the decision of whether and how data may be accessed at all should be based on who is accessing it, from where, and with what.

This is where Conditional Access (CA) comes in: it lets you control from where (country, network, etc.), who (which person), and how (secure|managed device, multi-factor [y|n], mobile [y|n], etc.) data is accessed. We'll see this in the demo below.

@Interested customers: we are happy to help you find partners who can support this topic end to end. Please contact your dedicated Microsoft representative.

@Interested partners: we are happy to help you build the necessary readiness so that you can implement this topic with and for customers. Please contact your PDM/PTS, or Andreas or me directly.

How

Scenario: we want to add extra protection to our SharePoint sites, so that access from inside the corporate network (a known external IP address/range) works without MFA but only from managed devices, insecure devices get no access at all, and mobile access (i.e. from unknown networks) is only possible with MFA.

  1. We open the AAD admin page (ideally in an InPrivate browser and with the appropriate credentials).
  2. Then - and this is where I get stuck now and again 😉 - we select "Conditional Access" in the menu; you may have to scroll (!!) to find it.
    AAD Portal + select Conditional Access
  3. First of all we define a "trusted network", i.e. our corporate network: under "Manage" we click "Named locations" and then "+New location".
  4. Now some creativity is called for: we give the location a descriptive name, define the IP range, and confirm everything with "Create" [this takes around 30s].
  5. Then we click "Policies" and afterwards "Add Policy", and get the following new dialog, in which I have already given the policy a name:
    Conditional Access - New Policy
  6. Now we come to the interesting part, where we define the conditions. We click "Users and groups" and select "all users". The access restriction itself is not controlled via CA but via the SharePoint permissions, so the CA policy should apply to all (access-authorized) users. Don't forget to click "Done" afterwards.
  7. Next, under "Cloud apps", select "SharePoint Online" as the app.
    Conditional Access - SharePoint Online selected
  8. Next, two settings have to be configured under "Conditions":
    1. under "Device platforms", set "configure" to "yes" for "include" and keep/select "All platforms". [Don't forget "Done"! 😉 ]
      Conditional Access - Device selected
    2. under "Locations"
  9. Now, under "Access controls" -> "Grant", we define what should happen when access occurs under the conditions above - we select "Grant access", "Require multi-factor authentication" and "Require device to be marked as compliant", and leave "Require all the selected controls" enabled. And of course, don't forget "Select" again.
    Conditional Access - exclude corp net
  10. Last but not least, we switch "Enable policy" from "off" to "on" and click "Create".

 

Now, when an employee accesses a SharePoint site from the fixed desktop PC in their office, this happens implicitly - if they do the same from their notebook at home, they additionally have to authenticate via MFA.

These demo guides provide an overview of how to use the respective solutions and products in the context of GDPR and do not constitute legally binding statements!

The December release of SQL Operations Studio is now available


This post is authored by Alan Yu, Program Manager, SQL Server.

We are excited to announce the December release of SQL Operations Studio is now available.

Download SQL Operations Studio and review the Release Notes to get started.

SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. To learn more, visit our GitHub.

SQL Operations Studio was announced for Public Preview on November 15th at Connect(), and this December release is the first major update since the announcement.

The December release includes several major repo updates and feature releases, including:

  • Migrating SQL Ops Studio Engineering to public GitHub repo
  • Azure Integration with Create Firewall Rule
  • Windows Setup and Linux DEB/RPM installation packages
  • Manage Dashboard visual layout editor
  • “Run Current Query with Actual Plan” command

For complete updates, refer to the Release Notes.

Migrating SQL Ops Studio Engineering to public GitHub repo

To provide better transparency with the SQL Operations Studio community, we have decided to migrate our internal GitHub branch to the public repo. This means any bug fixes, feature developments, or even test builds can be publicly viewed before an update is officially announced.

We made this move because we want to collaborate with the community to continually deliver features that our users want. This gives you the opportunity to see our changes in action to address your top voted issues. Visit our GitHub page and give us your feedback.

Azure Integration with Create Firewall Rule

Now let’s get into new features. A common issue when connecting to Azure SQL DB instances is that the connection can fail due to server firewall rules. This would require loading Azure Portal to configure firewall rules so that you can connect to your database, which can be inconvenient.

To speed up this process, we have enabled Azure Integration with Create Firewall Rule dialog. When your connection to an Azure SQL DB instance fails because of firewall settings, this dialog will appear, allowing the user to use their Azure subscription account to automatically configure the client IP address with the server. This retains the same experience as configuration on Azure Portal, except you can do it all through SQL Operations Studio.

Windows Setup installation and Linux DEB/RPM installation packages

We are always looking for new ways to improve the installation experience. With the December release, we have added Windows Setup installation to simplify installation on Windows. This wizard will allow the user to:

  • Select the installation location
  • Select the start menu folder
  • Optionally add the install location to PATH

In addition to Windows Setup, we have also added Linux DEB/RPM installation packages. These will add new ways for Linux users to download SQL Operations Studio for their choice of installation.

Feel free to try out these new installation experiences on our download page.

Manage Dashboard visual layout editor

In the initial release, there were not many options to customize the visual layout of the dashboards. With the December release, you can now resize and move your widgets by enabling the visual layout editor mode by clicking the pencil on the top right of the Manage Dashboard screen. This gives users greater control of their dashboard in addition to building their own custom insight widgets.

 

Run Current Query with Actual Plan command

Another new feature we have enabled is Run Current Query with Actual Plan, a command that executes the current query and returns the actual execution plan along with the query results. This feature area is still in progress as we work through the best UX for integrating the command directly into the query editor. While that design work is in progress, the functionality is available via the Command Palette, and you can define a keyboard shortcut if you use this feature frequently.

Contact us

If you have any feature requests or issues, please submit to our GitHub issues page. For any questions, feel free to comment below or tweet us @sqlopsstudio.

7 Years of “Hey! Scripting Guy!” Holiday specials!


So for those of you who may not have realized it, 7 years ago on “Hey! Scripting Guy!” a little idea started out which was to have a fun PowerShell based Holiday special.

If you're curious, here are all 7 in reverse order, including the badly written and horribly sung parody theme tunes.

Happy Holidays and Rock on with Windows PowerShell!

Sean, the Energized Tech
Microsoft Premier Field Engineer <- I love that I get to say that now! 🙂

Curly Blue and the Meaning of Scripting

Parody Theme Song - "Scripting Time is Here"

Rusty the Red-eyed Scripter

Parody Theme Song - "Scripting Guys, Scripting Guys"

Oliver Script

Parody Theme Song – “Code Glorious Code”

‘Twas the Night Before Scripting

Parody Theme Song – “PowerShell is Coming to Town”

It’s a Wonderful Shell!

Parody Theme Song – “Deck the Halls with Cmdlets”

How Mr. Finch Learned Scripting

Parody Theme Song – “The Scripting Song”

A PowerShell Carol

Parody Theme Song – “The Mr. Script Song”

Tar and Curl Come to Windows!


Beginning in Insider Build 17063, we're introducing two command-line tools to the Windows toolchain: curl and bsdtar. It's been a long time coming, I know. We'd like to give credit to the folks who've created and maintain bsdtar and curl - awesome open-source tools used by millions of people every day. Let's take a look at two impactful ways these tools will make developing on Windows an even better experience.

1. Developers. Developers. Developers.

Tar and curl are staples in a developer’s toolbox; beginning today, you’ll find these tools are available from the command-line for all SKUs of Windows. And yes, they're the same tools you've come to know and love! If you're unfamiliar with these tools, here's an overview of what they do:

  • Tar: A command-line tool for extracting files and creating archives. Outside of PowerShell or installing third-party software, there was no way to extract an archive from cmd.exe. We're correcting this behavior 🙂 The implementation we're shipping in Windows uses libarchive.
  • Curl: Another command-line tool, for transferring files to and from servers (so you can, say, now download a file from the internet).

Now not only will you be able to perform file transfers from the command line, you'll also be able to extract archives in formats beyond .zip (like .tar.gz, for example). PowerShell does already offer similar functionality (it has a curl alias and its own file-extraction utilities), but we recognize that there might be instances where PowerShell is not readily available or the user wants to stay in cmd.
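As a quick sketch of the tar syntax (shown here with Unix-style shell commands; the same tar flags work from cmd.exe on Build 17063+, and the curl URL is just a placeholder):

```shell
# Create some sample content to archive.
mkdir -p demo/src demo/out
echo hello > demo/src/readme.txt

# Create a gzip-compressed tarball from the demo/src folder:
tar -czf demo/src.tar.gz -C demo src

# Extract it into demo/out:
tar -xzf demo/src.tar.gz -C demo/out

# curl transfers files to/from servers; e.g. downloading an archive
# (placeholder URL, so it stays commented out here):
# curl -L -o demo/download.tar.gz https://example.com/archive.tar.gz
```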

2. The Containers Experience

Now that we’re shipping these tools inbox, you no longer need to worry about using a separate container image as the builder when targeting nanoserver-based containers. Instead, we can invoke the tools like so:
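For example, a dockerfile targeting nanoserver might now look something like this (the image tag and package URL are illustrative, not from the original post):

```dockerfile
FROM microsoft/nanoserver-insider

# Download and unpack a package directly in the nanoserver image --
# no separate PowerShell "builder" stage required.
RUN curl -L -o pkg.tar.gz https://example.com/some-package.tar.gz
RUN tar -xzf pkg.tar.gz
```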

Background

We offer two base images for our containers: windowsservercore and nanoserver. The servercore image is the larger of the two and has support for such things as the full .NET framework. On the opposite end of the spectrum is nanoserver, which is built to be lightweight with as minimal a memory footprint as possible. It's capable of running .NET Core but, in keeping with the minimalism, we've tried to slim down the image size as much as possible. We threw out all components we felt were not mission-critical for the container image.

PowerShell was one of the components that was put on the chopping block for our nanoserver image. PowerShell is a whopping 56 MB (given that the total size of the nanoserver image is 200 MB, that's quite the savings!). But the consequence of removing PowerShell was that there was no way to pull down a package and unzip it from within the container.

If you're familiar with writing dockerfiles, you'll know that it's common practice to pull in all the packages (node, mongo, etc.) you need and install them. Without these tools, users had to rely on a separate image with PowerShell as the "builder" image in order to construct an image. This is clearly not the experience we want our users to have when targeting nanoserver - they'd end up having to download the much larger servercore image.

This is all resolved with the addition of curl and tar. You can call these tools from servercore images as well.

 

We want your Feedback!

Are there other developer tools you would like to see added to the command line? Drop a comment below with your thoughts! In the meantime, go grab Insider Build 17063 and get busy curl'ing and tar'ing to your heart's desire.
