Microsoft announced on July 25 that BP, a leading global energy company, has selected Microsoft Azure as part of its global cloud computing strategy.
The agreement will see BP move advanced workloads out of its existing corporate data centers and onto Azure as part of the company’s modernization and transformation agenda, which is designed to deliver a sustainable step change in the company’s long-term performance.
By moving its proprietary data lake to Microsoft’s cloud platform and using Azure services with state-of-the-art visualization and predictive tools, BP will enable rapid data analysis, faster insights, and quicker decision-making.
“We have been impressed with Microsoft Azure Platform-as-a-Service, and its building block approach, particularly for our advanced workload requirements,” says Steve Fortune, Group CIO of BP.
“The Microsoft cloud provides the hyperscale needed for global businesses like BP to innovate quickly,” says Cindy Rose, Chief Executive of Microsoft UK. “Microsoft Azure will help…
Azure Information Protection allows administrators to define rules that classify corporate data, such as documents, emails, and other digitally stored information in the cloud, so that the information is protected automatically whenever the applicable criteria are met in an enforced configuration. Administrators can also configure optional enforcement, in which end users with access to the originating documents are offered the same protection options, based on suggestions triggered when matches are found within the sensitive data (for example, numbers structured like Social Security numbers, patient numbers, or credit card numbers, or wording in the document such as “confidential”).
Once protection labels have been created and applied and the data is protected, administrators can track the movement of the data and analyze where it flows and where it is stored, copied, or shared. This gives you a better understanding of what kinds of behaviors…
Determining how to efficiently manage logs for your large-scale applications can be a daunting task. This is particularly true when running PaaS services, such as App Services, WebJobs, and Azure Functions, where application log files are not easily accessible in real time. One effective solution leverages the log4net framework and a number of Microsoft Azure services in a surprisingly intuitive and scalable architecture. Most logging solutions allow for similar customizations; NLog, another popular logging framework, can also be customized to perform the same function. The following demo is done in C#.
How to Create a Custom Appender Using log4net:
1) Create an Azure Service Bus Queue.
This step requires an active Microsoft Azure subscription. First, create a new Service Bus namespace in the Azure portal. Azure Service Bus is a scalable and robust platform for…
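The demo itself is truncated above, but the core of the pattern is a custom log4net appender that forwards rendered log events to the Service Bus queue. A minimal sketch in C# might look like the following; the class name ServiceBusAppender and the configuration properties are illustrative (not from the original article), and the Service Bus calls assume the Microsoft.Azure.ServiceBus NuGet package:

```csharp
using System.Text;
using log4net.Appender;
using log4net.Core;
using Microsoft.Azure.ServiceBus; // NuGet: Microsoft.Azure.ServiceBus

public class ServiceBusAppender : AppenderSkeleton
{
    // Populated from the log4net XML configuration.
    public string ConnectionString { get; set; }
    public string QueueName { get; set; }

    private QueueClient _client;

    public override void ActivateOptions()
    {
        base.ActivateOptions();
        _client = new QueueClient(ConnectionString, QueueName);
    }

    protected override void Append(LoggingEvent loggingEvent)
    {
        // Render the event with the configured layout and enqueue it.
        var body = Encoding.UTF8.GetBytes(RenderLoggingEvent(loggingEvent));
        _client.SendAsync(new Message(body)).GetAwaiter().GetResult();
    }
}
```

A downstream consumer (for example, an Azure Function with a Service Bus trigger) can then drain the queue and persist the log entries, which is what makes the architecture scale independently of the application emitting the logs.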
At any point in time on any day of the week, Microsoft’s cloud computing operations are under attack: The company detects a whopping 1.5 million attempts a day to compromise its systems.
Microsoft isn’t just fending off those attacks. It’s also learning from them.
All those foiled attacks, along with data about the hundreds of billions of emails and other pieces of information that flow to and from Microsoft’s cloud computing data centers, are constantly being fed into the company’s intelligent security graph.
Microsoft invests about $1 billion in cloud security each year.
It’s a massive web of data that can be used to connect the dots between an email phishing scam out of Nigeria and a denial-of-service attack out of Eastern Europe, thwarting one attack for one customer and applying that knowledge to every customer using products including the company’s Azure computing platform,…
1: Build data-driven apps that learn and adapt
Applications show intelligence when they can spot trends, react to events, predict outcomes, or recommend choices, often leading to richer customer experiences, improved business processes, or issues being addressed before they arise. The three key ingredients of an intelligent app are:
Ingest data in real time
Query across historical and real-time data
Analyze patterns and make predictions with machine learning
With Azure, you can make your applications intelligent by establishing feedback loops, and applying big data and machine learning techniques to classify, predict, or otherwise analyze explicit and implicit signals. Today, apps for consumers and enterprises can deliver greater customer or business benefit by learning from user behavior and other signals.
Pier 1 Imports launched a mobile-friendly pier1.com, making shopping online easier. It enabled the selection of delivery options like direct shipment, picking up products in the local store,…
Deploying to an Azure VM is a seemingly simple process, but there are many steps, and thus many places to make mistakes. Recently I discovered that many, if not all, of the online resources detailing this process are either incomplete or severely outdated. Here I will outline the configuration and deployment process with up-to-date information, including instructions for the new Microsoft Azure Portal.
To get started, you will need a few things that this tutorial does not cover. This tutorial assumes that your Azure VM is already provisioned, deployed, and running IIS (Internet Information Services); that the VM is running Windows Server 2012 R2 or later; and that you have administrator access to the Azure subscription associated with the VM.
Configuring IIS on the Virtual Machine:
First, we’re going to need to install some IIS features. To assist…
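The excerpt cuts off here, but on Windows Server 2012 R2 the IIS features are typically installed from an elevated PowerShell prompt. The exact feature list depends on your application; the selection below (ASP.NET 4.5 support plus the management service used by Web Deploy) is a common starting point, not the article’s exact list:

```powershell
# Install IIS with ASP.NET 4.5 support, the management console,
# and the management service (required for remote Web Deploy).
Install-WindowsFeature Web-Server, Web-Asp-Net45, Web-Mgmt-Console, Web-Mgmt-Service -IncludeManagementTools
```

You can run `Get-WindowsFeature Web-*` afterwards to confirm which web-role features are installed.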
While cloud security continues to be a top concern, Microsoft recently shared insights from a survey showing that overall concern has dropped significantly since 2015. Half of organizations now contend that the cloud is more secure than their on-premises infrastructure. In conversations with its customers and partners, Microsoft increasingly hears how using the cloud improves an organization’s security posture. As many organizations push forward on their digital transformation through increased use of cloud services, understanding the current state of cloud security is essential.
Maintaining a strong security posture for your cloud-based innovation is a shared responsibility between you and your cloud provider. With Microsoft Azure, securing cloud resources is a partnership between Microsoft and its customers, so it’s essential that you understand the comprehensive set of security controls and capabilities available to you on…
Office 365 customers, known as tenants within the configuration, all share Microsoft’s global datacenter infrastructure, which is composed of hundreds of thousands of servers located all over the world.
Within the tenant, customer data is housed in a region based on the customer’s location and settings, as shown in the region drop-down box.
Microsoft automatically replicates customer data across at least two datacenters at any given time to minimize the risk of data loss during a failover.
When you choose a region, you can see where your data resides at rest, both in the zoomed map view and in the details in the right margin:
Microsoft operates over 100 datacenters globally and continues to open more datacenter regions for Office 365 business services. That said, not every datacenter hosts Office 365 and its services, but each adds to the available capacity (storage) and other…
The Execute R Script module in Azure Machine Learning is incredibly useful for manipulating data in ways that other modules do not cover. Its functionality can be further expanded by adding R packages that are not included in Azure ML by default. We will first show you how to get a list of packages that are already in your workspace and then how to add additional packages.
Checking Which R Packages are in Your Workspace
Create a new experiment, and place the following R code in an “Execute R Script” module:
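The code listing did not survive in this excerpt; a minimal snippet that produces such a list would be along the following lines, where `out` is just an illustrative variable name and `maml.mapOutputPort` is the helper Azure ML Studio uses to emit a data frame from an Execute R Script module:

```r
# Collect the installed packages as a data frame and
# emit it on the module's first output port.
out <- data.frame(installed.packages())
maml.mapOutputPort("out")
```
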
Run the experiment. The output of the Execute R Script module will be a list of the available packages.
Adding R Packages
Before you can use a package in Azure ML, you need to set up the zip-file structure in which Azure ML expects packages to appear. To do this, start by installing the…
There are some instances where you may wish to package a set of processes so that they are ready for installation by managed accounts. Dell Boomi offers a solution to this problem with integration packs, which allow you to offer a pre-packaged solution to your end users.
There are however a few things to consider before using integration packs:
A single integration pack can contain a maximum of 100 processes.
There are several components that cannot be added to an integration pack. These include:
Process Route Components
Processes that contain Process Route Components
Processes must first be published to a process library before being added to an integration pack. For more information about how to use process libraries in Dell Boomi, see this link.
Integration pack management and multi-install integration packs are optional Dell Boomi AtomSphere features that must first be…