Welcome back to this small series of posts on PowerApps and CDS.

Now that we’ve looked at how to create your first CDS database in the previous article HERE, let’s see how we can use it by loading some sample data.

We can load a subset of data generated using a free online service like Fake Name Generator (http://www.fakenamegenerator.com/). Go to Order in Bulk and get a set of data generated based on the settings you configure in the wizard. You can get a maximum of 50,000 records, but for the purpose of this example the default 3,000 should suffice.

Now, with the data file in hand, let's see how we can load these records into our Contact entity.


Common Data Service is an Azure-based service that allows you to bring data together from the Dynamics family of products, along with other external sources. It is based on the concept of having data living together in one place, where it can be served to apps created through PowerApps.

As described in this previous article, you start by creating an environment. With that in hand, you are ready to create your first database. Make sure you are an administrator in the environment where the database is to be created, and that the correct license is assigned to you. You create the database either through the Admin center, as described in the linked article, or directly from the PowerApps console, by first selecting the environment you will work in and then navigating to Common Data Service > Entities.



About a year ago, PowerApps evolved with the addition of Environments. This provides the logical separation needed in most enterprise deployments. You can think of these environments as logical containers that allow complete separation of security roles and audiences.

Now, do you have to use environments?

Well, you are using environments from the get-go. A default environment is created for you, and you can add additional environments as needed.

When do you want to use environments?

You can have separate environments for the various stages of the SDLC: Development, Test, QA, UAT, and Production environments, as needed.

In addition, due to scope limitations, you might need multiple environments for the same functionality. For example, let's have a look at the scope of an environment. Each environment is tied to a tenant and a geographical region (geo). As such, if you're running a global deployment, you might want separate environments for each geo your application runs in.

Another aspect of environments is the ability to separate your data sources. Each environment has its own defined data sources. This is what allows you to keep separate environments for the SDLC stages.


Integration is always an interesting thing. Recently I’ve been spending some time with Flow and Logic Apps, building several POCs. This got me thinking about which tool is right for which job.

We've had the built-in processes for a long time, and they have evolved into a very robust tool set. The Workflow is one of the first tools in the system for creating automation within Dynamics 365. A workflow can be configured to run in the background, which is the most resource-efficient option from a system performance perspective since it gets scheduled and queued, or to run immediately. The entire interface has remained pretty much unchanged for many versions, and should be familiar to most system customizers and administrators.

Now, Flow came into play as the shiny new kid on the block. This had some people wondering: if Flow is the new tool in the toolbox, and it appears to do similar things to Dynamics workflows, then what gives?

Oh, and for those more focused on integrations, you might have noticed that Logic Apps looks quite similar to Flow. Now, really, what gives?


A noteworthy idea with a challenging path ahead

So, with the July 2017 update dropping in at the beginning of October (I know, nomenclature issues here), among the new features we get, one item got some of us excited: Virtual Entities. The sales pitch was something along the lines of a new way to integrate data from external sources. That sounds beneficial, if we can finally reduce our arsenal of 3rd party tools and make our lives that much easier in the process. But is it really all that it's meant to be?

Well, let's have a look, shall we?

Continuing with the Asset Management example from THIS previous post, I had a requirement to create a view showing all assets that do not have a document library location created on SharePoint. Usually, for an asset, at a minimum a specifications sheet must be attached, along with the installation notes.

This is a very simplistic approach, using a Two Options (Yes/No) field for filtering, and a workflow to update this field when documents are added.

So, first step, add the field to the entity. I will name it Documentation (asm_documentation) and set its default value to No. You don't necessarily have to add it to any form, as it's not relevant as an asset property.

Next, create a view based on the Active Assets default view. I named it "Active Assets with no Documentation". Add a column to the view showing the Documentation field we just added, as well as any other fields that make sense. Add a filter to show only records with a Documentation value of No.
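If you want to pull the same list from code, for example while testing, a rough sketch using the Dynamics 365 (v9+) Web API could look like the one below. The entity name asm_asset and the column asm_name are assumptions used only for illustration; the filter simply mirrors the view: active assets whose Documentation flag is still No.

// Hypothetical sketch: retrieve active assets whose asm_documentation flag is still No (false).
// "asm_asset" and "asm_name" are assumed names used only for this example.
Xrm.WebApi.retrieveMultipleRecords(
    "asm_asset",
    "?$select=asm_name&$filter=statecode eq 0 and asm_documentation eq false"
).then(
    function (result) {
        result.entities.forEach(function (asset) {
            console.log("Missing documentation: " + asset.asm_name);
        });
    },
    function (error) {
        console.log(error.message);
    }
);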


Next, the workflow we create will update the Documentation field to Yes when documents are attached to the record. We do this by running the workflow against the Document Location entity. Set it as a background workflow, with the conditions as shown in the screenshot below.

[screenshot: workflow conditions]

Save and Activate the workflow. Publish all customizations.
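For completeness, the update the workflow performs could also be done from script if you ever need to flip the flag manually. A minimal hedged sketch via the Web API, again assuming the asm_asset entity name and with assetId holding the GUID of the record, would be:

// Hypothetical sketch: set the Documentation flag to Yes (true) on a single asset record,
// mirroring what the background workflow does.
var assetId = "00000000-0000-0000-0000-000000000000"; // replace with the asset record GUID
Xrm.WebApi.updateRecord("asm_asset", assetId, { "asm_documentation": true }).then(
    function () {
        console.log("Documentation flag set to Yes");
    },
    function (error) {
        console.log(error.message);
    }
);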

Now start loading some assets into your asset management solution. Add documents to some, and leave others with no documents attached. Once done, navigate to the "Active Assets with no Documentation" view. You will see a listing of assets that are missing the documentation library on SharePoint. You can then add the necessary documentation, assign the records to the proper team for documentation management, or trigger a process to request the installation team to load the required documentation.


And with that, you can determine which records do not even have a related SharePoint document library created.

NOTE: Once you navigate to Documents on the respective record, the folder is created on SharePoint and triggers the process to update the Documentation field value to Yes. This does NOT determine if any documents are actually loaded in the respective SharePoint folder. We’ll tackle a process to determine if various types of documents are attached to a record in another post.

Enjoy!

Seems like this is a topic of interest. My previous post HERE is still one of the most popular posts, even though it was published way back in April 2014. As such, and since I was recently putting together a small POC that involved this functionality, I decided to revisit this topic.

As I was saying, recently I had the opportunity to look at an asset management solution that integrates assets from Esri with other sources for enhanced data, as well as Customer Service and Field Service. Lots of integration work, as well as some interesting challenges along the way.

It is unfortunate that Esri decided a while back to pull support for their Dynamics CRM solution. Now we can only integrate to bring data into Dynamics 365. Even the support for the old 2015 version was retired at the beginning of June 2017. It's even more interesting since they seem to support SharePoint, but I digress.

The part I want to focus on, and to come back to why I mentioned the original article, is simple. It deals with leveraging Google Maps to locate assets on a visual map.

Through integration, I can get the needed details on asset type, description, and location in the form of Latitude and Longitude. This being a physical asset, like a tree, bench, bus stop, or any other type of urban furniture, an address does not apply, and the coordinates track the exact location of the item. IoT provides additional data points, depending on the asset, but that will be a different topic one day.

So, on my asset record, I am tracking Latitude and Longitude in two fields named asm_latitude and asm_longitude. Similar to the approach described in the referenced article, I'm using an HTML web resource to present the location.

The Google API has evolved, and you will find references stating that as of v3 a key is not required anymore. While technically that is correct, pay close attention to the licensing model. This is an asset tracking application, and, as described HERE in the Pricing and Plans section, a Premium Plan is required for the asset tracking use case. Obviously, contact sales for a price. And HERE is the description of the usage limits.

But, back to the record form, the format I chose for this POC is quite simplistic. See the screenshot below.

[screenshot: asset record form with the map web resource]

The displayed map is nothing more than a Web Resource of type Webpage (HTML).

The code to make it render, based on the Latitude and Longitude coordinates on the asset form, is below (remember, this goes in the web resource, in the Source of the page).

NOTE: I’m not showing any kind of error handling for simplicity. Build your own error handling to make sure no unexpected behavior is impacting the user experience.

<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="initial-scale=1.0, user-scalable=no">
<style>
/* Let the map fill the entire web resource */
html, body, #map {
    height: 100%;
    margin: 0;
    padding: 0;
}
</style>
</head>
<body style="word-wrap: break-word;">

<div id="map"></div>

<script type="text/javascript">
// Read the coordinates from the parent record form
var point_lat = window.parent.Xrm.Page.getAttribute("asm_latitude").getValue();
var point_lng = window.parent.Xrm.Page.getAttribute("asm_longitude").getValue();

// Called by the Google Maps API once it loads (see callback=initMap below)
function initMap() {
    var point_location = new google.maps.LatLng(point_lat, point_lng);

    var map = new google.maps.Map(document.getElementById('map'), {
        zoom: 15,
        center: point_location
    });

    var marker = new google.maps.Marker({
        position: point_location,
        map: map
    });
}
</script>

<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_KEY_HERE&callback=initMap" async defer></script>

</body>
</html>

Replace YOUR_KEY_HERE with an API key. You can obtain one from HERE.

There are three main pieces to this code. First off, the style in the header sets up the page so the map extends all the way. You could tinker with this, but it's best to just maximize it.

The second part is the div with an id of map. This is our anchor point on the page, and the script looks for it.

And lastly, the script. I'm reading the values from the Latitude and Longitude fields on the form using window.parent to reference the record form rather than the web resource the script is running in. The rest is straight out of Google's API documentation. I strongly recommend you browse through that for more examples, as well as the description of the available zoom values (cool to know).
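Reaching into window.parent works, but if you are on version 9 or later, an alternative is to have a form script push the coordinates into the web resource through getContentWindow(). A rough sketch of that alternative is below; the control name WebResource_map and the setLocation function it calls are assumptions, and setLocation would be a small function you define inside the web resource to store the values and re-center the map.

// Hypothetical sketch: form script registered on the asset form's OnLoad,
// with "Pass execution context as first parameter" enabled.
// "WebResource_map" is the assumed name of the web resource control.
function pushCoordinatesToMap(executionContext) {
    var formContext = executionContext.getFormContext();
    var lat = formContext.getAttribute("asm_latitude").getValue();
    var lng = formContext.getAttribute("asm_longitude").getValue();

    formContext.getControl("WebResource_map").getContentWindow().then(
        function (contentWindow) {
            // setLocation is a function we would define inside the web resource
            contentWindow.setLocation(lat, lng);
        }
    );
}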

Wham, bam, 5 minutes jam, happy demo!

I've recently had the opportunity to preview a new Dynamics 365 book, and I want to give a shout-out to the author, Rami Mounla. His book, Microsoft Dynamics 365 Extensions Cookbook (ISBN-13: 9781786464170), is available from Packt Publishing.

The book walks the reader through technical concepts, starting at the basics with no-code extensions and moving slowly into the meat of it all. It covers client- and server-side extensions, integration, and security. Solution architecture and DevOps are also covered, along with an introduction to various tools and frameworks.

Overall, it is a good overview of capabilities from a development and technical architecture perspective. While the book is described as primarily aimed at existing Dynamics resources, I think it's also a good resource for others starting on the journey to Dynamics 365. Jumping farther from the shallow end will make you a better swimmer.

Pick up the book, read it, learn something new, or just dust off some of that existing knowledge you've put away for a while.

 

Book Details

Title: Microsoft Dynamics 365 Extensions Cookbook

Author: Rami Mounla

Length: 462 pages

Edition: 1

Language: English

Publisher: Packt Publishing

Publication Date: 2017-06-07

ASIN: B01MD2BN8U

As it stands right now, the CRM and former AX functionality has been wrapped under the Dynamics 365 umbrella. All nice, but with the changes made to the former Dynamics AX to bring it to a modern state and a cloud model, the old-style integrations have to be "adjusted".

As announced by Microsoft, there is an integration planned, but a few months into the new platform, we're not seeing it just yet.

The roadmap site, at roadmap.dynamics.com, is now publishing some details on the expected functionality. The integration is described under the heading "Prospect to cash integration of Dynamics 365 for Sales and Dynamics 365 for Operations". As you can see, it all now follows the business functionality model familiar to the platforms.

The proposed integration leverages the somewhat recently released Common Data Service. If you still don’t know what that is, see the description HERE.

The idea of this proposed integration is to allow users to start the sales process in Dynamics 365 for Sales and complete the order fulfillment in Dynamics 365 for Operations. This leverages both platforms for their strengths. Or does it? Let's look at the details available so far.

Accounts and Contacts

For both Accounts and Contacts, the plan is to sync these records from Sales to Operations.

The nagging question here is what happens if these records get updated on the Operations side, or simultaneously on both sides? No details yet.

A workaround is to allow editing only on the Sales side, and keep the records read-only on the Operations side. As such, financial or operations users will most likely need a Plan 2 license, or may possibly get away with a Team Member license for light usage. For additional licensing implications see the licensing guides, as linked in THIS earlier post.

Product Catalog

This is to be maintained in Operations, and synchronized back into Sales.

The question here is how bundles and packaged offerings for up-sell are going to be handled. An assumption is that these groupings will be created and maintained on the Sales side, which means that a user managing products must use both Sales and Operations to maintain the product catalog. Or maybe have a team/user maintaining the products in Operations, and another team/user managing bundles and sales artifacts on the Sales side.

Again, licensing implications, as well as additional coordination between the platforms and/or users/teams.

Possibly 3rd party products could help here a little? There's an opportunity.

Quotes

These actually follow the same model as the integration in previous versions: Quotes are fully created and managed on the Sales side.

One could argue why even synchronize them into Operations, but this is in line with the old model where you could jump to the former AX either at the Quote or the Order level. See the next paragraph on Orders.

Orders

Now it looks like the recommendation is to proceed all the way to order generation in the Sales module, and then sync to Operations at the Order level. This should be OK for most situations.

Invoices

Invoices are to be generated and processed, as expected, on the Operations side. No issues here. They are to be synchronized back to Sales for visibility, so I would see the Invoice records as completely read-only on the Sales side. The financial implications of changing an invoice cannot really be handled on the Sales side to begin with, so there is no real reason to have these records editable there.

Conclusion

While this puts us on the right path, somewhat, it is far from ideal for the following reasons:

Licensing implications could require users to hold a more expensive license to be able to handle a complete business flow along with related needs.

This only covers a standard sales process. As soon as you step outside of this model, the amount of work involved could easily become similar to simply starting from scratch. This is just an assumption right now; we'll see.

While leveraging the Common Data Service has its advantages, including the use of Flow and PowerApps, you now have data flowing through three places that must stay in sync. This brings up the next point.

Possible real-time or near-real-time issues. As discussed under Accounts and Contacts, if you don't lock the data on one side as read-only, simultaneous updates to the same record in the Sales and Operations applications could lead to unexpected results and overwrites. A robust solution, with proper record locking on one side while the record is edited on the other, becomes essential. The roadmap description does not hint at any of this. This is potentially the biggest problem here.

In the "Fast implementation" heading of the article, one word sticks out: templates. This begs the question: is this really going to be a production-ready option? Or are we back to the "template" integration options available with the old versions, which almost always needed to be extended? Or maybe 3rd party solutions will fill in the gap? There's an opportunity for 3rd party vendors here too.

Until additional details become clear, let's look at this as what it's described to be: a possible scenario, a "template".

Enjoy!

I've recently done an on-premises install where I encountered a number of issues. One of them is the following error message:

Action.Microsoft.Crm.Setup.Common.InstallWindowsSearchAction failed.

The service cannot be started, either because it is disabled or because it has no enabled devices associated with it. (Exception from HRESULT: 0x80070422)


The message is pretty explicit, in that a service is not started. It turns out, after a little investigation, that for some reason the SQL Server service as well as the SQL Server Agent service were turned off. This could be the result of a failed SQL update applied previously, or some other issue. Simply setting these services to Automatic and starting them allows the installation to proceed further with no issue.


No, you do not have to restart the installation; just make sure the services are running properly.

Smooth sailing afterward.

Enjoy!
