State-based vs Interaction-based testing in Dynamics CRM

When writing unit tests, one can focus on two different aspects of the code under test. We can call these two types of tests ‘State-based’ and ‘Interaction-based’. The first concentrates on the end result of running the code under test; the second focuses on how that result was achieved.

This distinction is sometimes called ‘white-box vs black-box’ testing. With white-box testing (interaction-based) we can ‘see inside’ the code as it is running, and observe the internals of the code under test. With black-box (state-based) testing, the workings of the machinery are hidden from us; all we can see are the inputs and outputs.

[Images: “White Box (Interaction-based) testing” vs “Black Box (State-based) testing” – designed by Freepik]

When writing unit tests for CRM plugins or workflows, until recently the only available method was to use a mocking framework, which restricts you to white-box testing. Mocking frameworks focus on interactions – they work by creating a mocked implementation of an interface (such as IOrganizationService), and as part of your test you have to write code specifying what should happen when a particular call is made to the mocked interface (e.g. “if the RetrieveMultiple method is called with these parameters, return this entity collection”). Your assertions then take the form of “a call should have been made to the Update method, with these parameters, this many times”. From this we can infer that the code is working correctly, but we cannot actually verify the state of the CRM database at the end of the test.

FakeXRMEasy gives us the ability to do state-based testing for Dynamics CRM code, because it not only mocks the Organisation Service, it also creates an in-memory CRM database with which you can interact during your tests. All calls to the Organisation Service made during your tests (Creates, Updates, Retrieves, Associates, etc.) behave in exactly the same way as they would against a real instance of CRM, and you can verify your code by testing the state of the in-memory database using those same methods.

I’ve recently started using FakeXRMEasy for all my CRM Unit testing because I believe that writing State-based tests results in tests which are more maintainable, and this is a critical factor when building up a large library of tests for a project over a period of time.

State-based tests are more maintainable for the following reasons:

You can change the implementation of your code without invalidating your tests

Imagine that we have a plugin which is called when a record is updated, and which in turn updates a number of other related records in a parent-child relationship to the record being updated. Your initial implementation might use the Update method of the Organisation Service, and your test would contain an assertion that the Update method was called with certain parameters, a certain number of times (once for each child record).

// Assert that Update was called n times with an entity whose logical name is "newChildEntityName"
Service.AssertWasCalled(
	s => s.Update(Arg<Entity>.Matches(p => p.LogicalName == "newChildEntityName")),
	o => o.Repeat.Times(numberOfChildRecords));

If you later decided that, for performance reasons, you wanted to change your code to use an ExecuteMultiple request instead of executing many update statements in a row, this would invalidate your test, even though your code would still be achieving exactly the functional result you required.
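
For illustration, here is a sketch of what that refactored implementation might look like (service and childRecords are placeholder names; this is not code from the original plugin):

// Batch the child updates into a single ExecuteMultipleRequest.
// Functionally equivalent to calling Update once per child, but the
// interaction-based assertion above (Update called n times) would now fail.
var batch = new ExecuteMultipleRequest
{
	Settings = new ExecuteMultipleSettings { ContinueOnError = false, ReturnResponses = false },
	Requests = new OrganizationRequestCollection()
};

foreach (Entity child in childRecords)
{
	batch.Requests.Add(new UpdateRequest { Target = child });
}

service.Execute(batch);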

Using state-based testing instead, we don’t have to know the details of how the code works in order to write the test – i.e. we don’t have to know whether a query uses FetchXML or a QueryExpression, and we don’t have to specify that ‘this type of query executed with these parameters will return this result set’. We just set up the faked CRM context at the start of the test and verify the state of the records we are interested in at the end, using standard CRM API calls.

Here is an example in LINQPad of the simplest test I could write, to demonstrate that you really only need to know the CRM API in order to use FakeXRMEasy.
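
The original example is a LINQPad screenshot; a minimal sketch of the same kind of test, assuming the FakeXRMEasy v1 API (XrmFakedContext, Initialize and GetOrganizationService), looks something like this:

// Requires the FakeXrmEasy NuGet package plus the CRM SDK assemblies.
// Arrange: set up the in-memory CRM database
var context = new XrmFakedContext();
var account = new Entity("account") { Id = Guid.NewGuid() };
account["name"] = "Before";
context.Initialize(new List<Entity> { account });
var service = context.GetOrganizationService();

// Act: run the code under test (here just a direct update, for brevity)
var update = new Entity("account") { Id = account.Id };
update["name"] = "After";
service.Update(update);

// Assert: verify the state of the in-memory database using ordinary CRM API calls
var result = service.Retrieve("account", account.Id, new ColumnSet("name"));
Console.WriteLine(result["name"]); // "After"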

Black box tests are more readable

State-based tests are more easily understood and easier to write. You don’t have to master the sometimes convoluted syntax associated with mocking frameworks, and your tests use the same methods as your production code. If you are a CRM developer, your tests will just use the already familiar methods of the CRM API (Create, Update, Retrieve, etc.).

Readability of tests is important because it makes it easier to understand what the test is actually testing, and so easier to maintain in the long run. It also makes it easier for a developer who is new to the project to understand the code and its tests.

Black box tests are more “BDD”

Black box testing lends itself to a more BDD (behaviour driven development) approach. Tests which make assertions about the expected state of the application are easier to turn into BDD style feature files than those which test the inner workings of the application, and are easier to discuss with business analysts, testers and end users.

No need to amend production code

With most mocking frameworks, when dealing with sealed classes such as RetrieveAttributeResponse, it is not possible to set the returned AttributeMetadata property because it is read-only, so it is necessary to write a wrapper class and amend the production code to use that wrapper, as described in this article by Lucas Alexander. Whilst this approach does work, it is preferable not to have to amend production code just to make it testable, and it adds another source of possible confusion for developers coming across the code for the first time. With FakeXRMEasy this is not necessary: requests such as RetrieveAttributeRequest and RetrieveEntityRequest can be handled without any modification to production code.

When should we carry out White Box testing?

It is still possible to perform interaction-based testing with FakeXRMEasy, and there are times when you might still want to do this. Interaction-based tests are useful if you want to test that your implementation is working as expected. A couple of examples of this are:

If you have conditional logic in your code which (say) switches between using a Retrieve and a RetrieveMultiple, your test could assert that the correct method had been called.

If you notice a bug where a call to the API is being made more times than expected, you could introduce an assertion to check that (say) a query is only carried out once.

Here’s the same test from earlier, this time written as an interaction test.
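
The original is again a LINQPad screenshot. As an illustration of the style only, here is a rough interaction-style assertion written with Moq against a mocked IOrganizationService (not FakeXRMEasy’s own API; numberOfChildRecords is a placeholder):

// Requires the Moq NuGet package plus the CRM SDK assemblies.
var serviceMock = new Mock<IOrganizationService>();

// ... run the code under test against serviceMock.Object ...

// Assert *how* the result was achieved: Update was called once per child record
serviceMock.Verify(
	s => s.Update(It.Is<Entity>(e => e.LogicalName == "newChildEntityName")),
	Times.Exactly(numberOfChildRecords));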

This LINQPad script contains the above tests, with both the state-based and interaction-based assertions present.

This article by Jordi Montaña, the creator of FakeXRMEasy, goes into the differences between using a normal mocking framework and FakeXRMEasy in much more depth.

I hope I’ve managed to convey some of the many advantages that I feel FakeXRMEasy brings to unit testing in CRM.


The case of the missing xml file

Our automated deployment of a CRM Solution file started failing recently and it was quite a journey to find out what was causing the error.

We automate the deployment of our CRM solutions using the Build/Release processes in Visual Studio Team Services (VSTS). In short the process looks like this:

  1. Using the Dynamics 365 Developer Toolkit, we extract and unpack a solution into a customisations project in Visual Studio.
  2. We check these customisations into VSTS.
  3. This triggers a build of the solution, and we use Wael Hamze’s VSTS Extensions to pack the solution back into a deployable zip file.
  4. The successful build process triggers a release to CRM, using the PowerShell utilities from the ALM toolkit from ADXStudio. (We can’t use Wael’s solution for this because it doesn’t work with our version of CRM 2015.)

The first thing we knew was that our automated release started failing in VSTS with the error:

“importjob With Id = feb7b87f-1ff9-4914-ba14-0116ca959584 Does Not Exist”

As we could see from the logs, the import got kicked off and then failed within a couple of minutes with the ‘import job does not exist’ error.

A few searches around the topic suggested that the issue could be caused by timeouts when uploading large solution files, but this did not seem to be the case for us: we would not expect timeouts suddenly to start occurring persistently after one fairly minor check-in, and deployments of this large solution normally take 5-10 minutes, whereas the error was occurring in less than two minutes.

The first thing we tried was to see if we could import the solution file as exported directly from the CRM front-end, using the front-end solution import process.

This worked OK, so we turned our attention to the solution file created by the automated build in step 3 above.

We copied this file from the server where the VSTS agent runs and tried to import it through the CRM front-end. It failed almost immediately with the following error:

So, now, we’ve got something to work with – there’s something wrong with the file that the solution packager is creating.

Switching on verbose tracing on the server to which we were deploying enabled us to find a more helpful error message in the w3wp-CRMWeb trace file:

Crm Exception: Message: The import file is invalid. XSD validation failed with the following error: ‘The element ‘EntityRelationship’ has incomplete content. List of possible elements expected: ‘EntityRelationshipType’.’. The validation failed at:…

Googling the error message took us to this very useful article, which suggested that a relationship mentioned in the Relationships.xml file (in the ‘Other’ folder of the unpacked solution) might not have a corresponding file in the Relationships folder. Sure enough, the Relationships.xml file contained a reference to a newly created relationship, and the Visual Studio project referenced the definition file for this relationship, but that file had not been checked into source control:

Checking that missing file into source control fixed the problem.

Lessons learned:

1.) Those error messages are not always what they seem – sometimes the only place you’ll get really useful information is the CRM trace logs.

2.) Make sure you add any new files which are part of your customisation project into source control.

 


Error while loading code module: ‘Microsoft.Crm.Reporting.RdlHelper’

When trying to compile a FetchXML-based report in Visual Studio 2015 which had been developed in an earlier version of Visual Studio, we were getting the following error:

Building the report, ReportName.rdl, for SQL Server 2008 R2, 2012 or 2014 Reporting Services.
[rsErrorLoadingCodeModule] Error while loading code module: ‘Microsoft.Crm.Reporting.RdlHelper, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35’.
Details: Could not load file or assembly ‘Microsoft.Crm.Reporting.RdlHelper, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35’ or one of its dependencies. The system cannot find the file specified.
[rsCompilerErrorInExpression] The Language expression for the textrun ‘Table0_Details0.Paragraphs[0].TextRuns[0]’ contains an error: [BC30456] ‘Crm’ is not a member of ‘Microsoft’.

We had already installed the CRM Report Authoring Extension for CRM 2015, but this helper dll was not being picked up by Visual Studio.

We were able to locate the dll on the SSRS Server of our CRM Development environment – in our case it was located here:

C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\bin

but it may be elsewhere for different versions of SQL or CRM.

We copied the dll to the developer PCs and then registered it in the GAC (Global Assembly Cache) on those machines. This is done by running the gacutil.exe utility. In some cases this might be available from a normal command prompt, but we found we had to run it from the Visual Studio 2015 ‘Developer Command Prompt for VS2015’.

On Windows 10, you can access this from the Start menu. You need to run the program as an Administrator, so find it in the Start menu, then right-click and choose Run as Administrator:

(You will have to get to this in a slightly different way if you are using Windows 7, but it is still accessible from the Start button).

With the new window open, navigate to the folder where the dll is and run the command gacutil -i Microsoft.Crm.Reporting.RdlHelper.dll

If you are compiling a report in Visual Studio and it is giving the error above, you might need to point the report to this version of the RdlHelper.dll

Open the report in Visual Studio and select Report -> Report Properties from the menu. (NB This option is only available if the report is open and you have selected an element in the report designer/previewer.)

In the Report Properties dialogue box, select ‘References’.

Even if the RdlHelper is listed, it may be that this is the wrong version. Click on ‘Add’ and then browse to the newly registered dll in the GAC. This will be at a location like this:

C:\Windows\assembly\GAC_MSIL\Microsoft.Crm.Reporting.RdlHelper\7.0.0.0__31bf3856ad364e35\Microsoft.Crm.Reporting.RdlHelper.dll although the exact location will depend on which version of the helper you have installed.

Remove any pre-existing references (which are most likely to a different version of the dll).


Using Impersonation in Dynamics CRM

While you can’t log in as another user through the Dynamics CRM front end, as a System Administrator it is possible to impersonate another user when making calls to the CRM API.

This is very useful if you want to be able to do the following:

  • Test the correct implementation of security roles.
  • Check which records can be seen by which users if sharing is being used.
  • Search for charts or views belonging to different users.

For example, if you have complex security rules involving automatic sharing of records with certain privileges, this can be very time-consuming to test through the CRM front end. Writing an integration test where a record is created by one user, and then the permissions on this record can be immediately checked for another user, is very simple using impersonation.
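
As a rough sketch (not from the original post – userAId and userBId are placeholder GUIDs for the two test users, and orgService is a proxy logged in as a System Administrator):

// Create a record as user A, then check whether user B can read it.
orgService.CallerId = userAId;   // impersonate the record owner
Guid accountId = orgService.Create(new Entity("account"));

orgService.CallerId = userBId;   // impersonate the user whose access we are testing
try
{
	orgService.Retrieve("account", accountId, new ColumnSet(true));
	Console.WriteLine("User B can read the record");
}
catch (FaultException<OrganizationServiceFault>)   // from System.ServiceModel
{
	Console.WriteLine("User B does not have read access to the record");
}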

As another example, we recently had a support issue where a team were all using a personal view created by one of the team. A few of the team were on holiday and no-one knew who had originally created the view. It was straightforward to cycle through a list of users in a particular team and list the personal views to which they had access and who the views were created by. This information is not readily accessible through the front end, nor through the API without being able to switch users, but using impersonation it was very easy to retrieve the information needed.

Here is an example which works in LINQPad but which you can easily amend to work in a Visual Studio project. The key line is the one in which we switch the caller id by setting orgService.CallerId.

/*
This sample shows how you can change the callerid of the organisation service proxy
allowing you to impersonate different users
*/
void Main()
{
	// Log in with system administrator account
	var orgService = ConnectToCrm(Util.GetPassword("dev org service url"));

	// Display name of currently logged in user
	WhoAmIResponse whoami = (WhoAmIResponse)orgService.Execute(new WhoAmIRequest());
	Entity u = orgService.Retrieve("systemuser",whoami.UserId, new ColumnSet("fullname"));
	Console.WriteLine("Current user is " + u["fullname"]);
	
	// Create an account and display the user who created it
	CreateAccountAndShowWhoCreatedIt(orgService);

	// Get the user details of a different user
    string username = Util.GetPassword("myusername");
	Entity user = GetUser(orgService, username);
	
	// Switch the Caller ID to that user
	Console.WriteLine("Switching caller id to " + username);
    orgService.CallerId = user.Id;

	// Create an account and display the user who created it
	CreateAccountAndShowWhoCreatedIt(orgService);
}

private void CreateAccountAndShowWhoCreatedIt(OrganizationServiceProxy orgService)
{
	Entity account = new Entity("account");
	Guid accountId = orgService.Create(account);
	account = orgService.Retrieve("account", accountId, new ColumnSet("createdby"));
	Console.WriteLine("Account record was created by " + ((EntityReference)account["createdby"]).Name);	
}

private static OrganizationServiceProxy ConnectToCrm(string uri)
{
	string UserName = Util.GetPassword("sysopsuser");
	string Password = Util.GetPassword("sysopspassword");

	AuthenticationCredentials clientCredentials = new AuthenticationCredentials();
	clientCredentials.ClientCredentials.UserName.UserName = UserName;
	clientCredentials.ClientCredentials.UserName.Password = Password;

	OrganizationServiceProxy serviceProxy = new OrganizationServiceProxy(new Uri(uri), null, clientCredentials.ClientCredentials, null);

	Console.WriteLine("Connected to " + serviceProxy.ServiceConfiguration.ServiceEndpoints.First().Value.Address.Uri);

	return serviceProxy;
}

public static Entity GetUser(OrganizationServiceProxy orgService, string domainName)
{
	QueryExpression queryUser = new QueryExpression
	{
		EntityName = "systemuser",
		ColumnSet = new ColumnSet("systemuserid", "businessunitid", "domainname"),
		Criteria = new FilterExpression
		{
			Conditions =
					{
						new ConditionExpression
						{
							AttributeName = "domainname",
							Operator = ConditionOperator.Equal,
							Values = {domainName}
						}
					}
		}
	};

	EntityCollection results = orgService.RetrieveMultiple(queryUser);
	return results.Entities[0];
}

Here is a LINQPad script of the above code.

For fairly obvious reasons, it is not possible to change the caller id to that of a user who is disabled, or to a user who has no security roles. Such a user would not be able to make any API calls, so there would be no point in doing so anyway.


Integrating LINQPad scripts into Source Control

Whilst there is no built-in source control integration in LINQPad, it is still easy to keep your LINQPad scripts under source control. This means that scripts can easily be shared amongst your team and you can take full advantage of the benefits of source control, such as versioning and safe file storage in an external repository.

As an example I will show you how we have integrated LINQPad scripts into our Visual Studio Team Services code repository – the same pattern should work equally well for other source control systems.

First of all, create a folder within your VCS which will hold your LINQPad scripts. We have a folder called LINQPad queries with sub-folders for different projects and another folder for useful samples. Once you have created that folder, you can link that folder location on your computer to the ‘My Queries’ folder in LINQPad by clicking on the ‘Set Folder…’ link…

… and then selecting this as your custom location for queries:

Now that your LINQPad queries folder is under source control, any changes you make will be flagged as Pending Changes in Visual Studio, and you can add new files to the repository using the Source Control Explorer in Visual Studio.

If your team all map their ‘My Queries’ folder to the same location, then all files in the repository become shared with the team.

You can avoid uploading passwords and other sensitive information in your scripts by using the Util.GetPassword method, as explained in this blog post.
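
For example, a minimal illustration (the key name here is arbitrary):

// Instead of hard-coding a secret in a script that will be checked in:
string crmPassword = Util.GetPassword("crm dev password"); // prompts once, then uses LINQPad's password manager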

PS If you just want to share LINQPad scripts with other users without putting them into source control, LINQPad has a feature called Instant Share which allows you to upload a file to a repository on the LINQPad server – you will then be given a URL of the uploaded file which you can share with colleagues or post on your blog, like this script which shows you how to generate a PDF in LINQPad.


Creating snippets for use with LINQPad

LINQPad automatically makes available any snippets which are configured in Visual Studio. For example, if you type ‘cw’ (for Console.WriteLine), LINQPad will inform you that it is aware of the snippet and prompt you to press Tab to insert it at the cursor.

It is also possible to create your own snippets in LINQPad, giving you quick access to blocks of code that you use regularly.

For example, take a look at the script below. This is the bare minimum you might need to connect to CRM from LINQPad without using the CRM Driver for LINQPad, as described in this post.
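
The script itself is shown as a screenshot; a sketch of the kind of ConnectToCrm method it contains (essentially the same code as in the manual-connection post elsewhere on this blog – the URL and credential names are placeholders) looks like this:

// Minimal connection method, suitable for turning into a snippet
private static OrganizationServiceProxy ConnectToCrm()
{
	var credentials = new AuthenticationCredentials();
	credentials.ClientCredentials.UserName.UserName = Util.GetPassword("crm username");
	credentials.ClientCredentials.UserName.Password = Util.GetPassword("crm password");
	var orgServiceUrl = new Uri("https://crm.example.com/XRMServices/2011/Organization.svc");
	return new OrganizationServiceProxy(orgServiceUrl, null, credentials.ClientCredentials, null);
}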

Turning the ‘ConnectToCrm’ method into a snippet for reuse in other scripts could not be easier. Select the code that you want to turn into a snippet (in this case the ‘ConnectToCrm’ method).

Press ‘F4’ to access the Query Properties and then press ‘Save as snippet…’. The Create Snippet dialogue box appears:

Make sure that you tick the ‘Code’ tickbox. Click Next and save the snippet in the suggested location with a sensible name (such as ‘ConnectToCRM’).

LINQPad will confirm that your snippet has been saved.

In a new C# Program script, your snippet will be available to use:

Press tab to insert the snippet and F5 to run the script:

Your script should automatically contain the right references and run without any problems.


Connecting LINQPad to Dynamics CRM using a manually created connection

In the first article in this series, I demonstrated how to use the Dynamics CRM LINQPad Driver to connect LINQPad to CRM. In this follow-up I will show how to create a ‘manual’ connection, and why you might want to do this.

Create a basic ‘WhoAmI’ request in LINQPad without the CRM LINQPad driver.

Whilst the steps outlined below might seem laborious, you will only need to carry out the majority of these actions the first time you create a manual connection to CRM.

Create a new query in LINQPad and change the type to C# Program. You will see the following program template:

Replace the entire query with the following code:

 
void Main()
{
	OrganizationServiceProxy orgSvc = ConnectToCrm();
	WhoAmIResponse whoResp = (WhoAmIResponse)orgSvc.Execute(new WhoAmIRequest());
	Entity user = orgSvc.Retrieve("systemuser", whoResp.UserId, new ColumnSet(true));
	Console.WriteLine("Name of logged in user: " + user["fullname"]);
}

private static OrganizationServiceProxy ConnectToCrm()
{
	AuthenticationCredentials clientCredentials = new AuthenticationCredentials();
	clientCredentials.ClientCredentials.UserName.UserName = "PUT YOUR USERNAME HERE"; 
	clientCredentials.ClientCredentials.UserName.Password = Util.GetPassword("mypassword");
	Uri OrgServiceURL = new Uri("This should be the URL of your organisation service");
	OrganizationServiceProxy serviceProxy = new OrganizationServiceProxy(OrgServiceURL, null, clientCredentials.ClientCredentials, null);
	Console.WriteLine("Connected to " + serviceProxy.ServiceConfiguration.ServiceEndpoints.First().Value.Address.Uri); return serviceProxy;
}

NB This example uses an internet-facing on-premises connection to CRM 2015. You may need to adjust the code to make a connection to your own CRM organisation.

You will notice that certain elements in the code are highlighted red – this is an indication that the script is missing an assembly reference:

The way in which you deal with this will depend on whether you are using LINQPad Premium/Developer or the Free/Pro edition.

Importing assemblies using the NuGet package manager

The Premium and Developer editions offer full NuGet integration, so you can import the missing assemblies from NuGet. Press F4 and then ‘Add NuGet’ to launch the LINQPad NuGet package manager (or press Ctrl+Shift+P to go to it directly). Search for Dynamics CRM to show a list of relevant assemblies – I tend to use the Dynamics CRM 2015 Clean SDK Assemblies, but you may use others depending on your version of CRM, etc.

Click ‘Add to Query’ for the assembly you wish to add:

Now press ‘Close’ and ‘OK’ to return to your query

Adding assemblies to your query manually

If you do not have the Developer or Premium editions of LINQPad you can still add the required assemblies to your query manually. Press F4 to launch the Query Properties window, browse to the location of the assemblies you want to add (such as your SDK bin folder) and select the assemblies you require – for this example you will only need Microsoft.Crm.Sdk.Proxy.dll and Microsoft.Xrm.Sdk.dll.

If you want these to be added by default for new queries, press the ‘Set as default for new queries’ button.

NB It is also possible to create a ‘Libraries’ folder in the same folder as the LINQPad executable. Any DLLs added to this folder will automatically be picked up by LINQPad.

Adding ‘using’ directives to your query

Regardless of how you added the assemblies to your query, you now need to add ‘using’ directives to tell the query which namespaces to use. You will notice that the lines highlighted in red now have a little blue drop-down allowing you to either add a ‘using’ statement to the query or use a fully qualified class name:

Select the ‘using’ option and LINQPad will add an (invisible) using directive to the query. Repeat this for any other elements which are not yet bound to a using statement.

Once you have gone through this process, there will still be one unresolved reference. If you try and run the query you will get this error:

Hit F4. You will be asked if LINQPad should automatically add the missing reference:

Press “Yes, and add missing references automatically from now on”.

NB You only need to go through the above process once. Once you make the correct assemblies available to LINQPad, these will be available in other queries and the references will be automatically resolved.

NB2 You may also get an error saying that Microsoft.IdentityModel or one of its components cannot be found. To install Windows Identity Foundation on your machine, follow these steps: http://blog.xrm.com/index.php/2012/10/quick-tip-enable-windows-identity-foundation-windows-8/

Run the Query

Press ‘F5’ to run the query. The first time you do this you will be prompted to enter your password.

This is because we are calling the Util.GetPassword method. When you enter your password and select ‘Save Password’, your credentials will be stored by LINQPad’s password manager. This allows you to store credentials in encrypted format so that you do not have to store them as plain text in your query – this is particularly useful if you are sharing scripts amongst a team! Subsequent uses of the script will use your saved password without prompting. You can manage your passwords through ‘File -> Password Manager’.
After you have entered your password, the request should execute and you should see the results:

Why create a manual connection to CRM?

I would always advise you to use the LINQPad driver for CRM as it offers many advantages over the manual process outlined above. However, there are certain circumstances where you would want to connect to CRM without using the driver.

  1. You might want to connect to more than one instance of CRM simultaneously, for example to compare data (or metadata) between two different CRM instances. With a manual connection you can connect to more than one Organization Service and query them both at the same time.
  2. If you want to use impersonation in your query – it is possible to impersonate different users when making calls to the Organisation Service, but this does not seem to be possible when using the LINQPad driver.
  3. If you have an entity in your organisation which has the same name as a relationship, you will get an error such as “Cannot compile typed context: The type ‘Microsoft.Pfe.Xrm.ClassName’ already contains a definition for ‘entityName’” when connecting to CRM. This is a known issue and the only option here is a manual CRM connection.
  4. It seems as if the connector does not work with Dynamics 365. I have not been able to confirm this myself – if this is the case, you can try the LINQPad 4/5 Driver for Dynamics CRM Web API or create a manual connection.

Connecting LINQPad to Dynamics CRM using the Dynamics CRM LINQPad Driver

LINQPad is an invaluable tool for any .NET developer, and it will greatly enhance your ability to write and support applications using Dynamics CRM. There are two ways to connect to an instance of Microsoft Dynamics CRM using LINQPad. This two-part article describes the two methods and examines the reasons why you might choose one over the other.

Connecting to CRM using the Dynamics CRM LINQPad Driver

NB The CRM LINQPad driver works with CRM 2011 – 2017. It seems as if the connector does not work with Dynamics 365. I have not been able to confirm this myself – if this is the case, you can try the LINQPad 4/5 Driver for Dynamics CRM Web API or create a manual connection as described here.

The LINQPad Driver for Dynamics CRM has been developed by Microsoft’s Premier Field Engineering (PFE) team. It allows you to create a number of predefined connections to CRM, which you can then use in your LINQPad CRM scripts.

The LINQPad driver for Dynamics CRM can be downloaded here, and the (very easy) installation instructions are to be found here.

LINQPad connection definitions persist between sessions, so, provided your credentials do not change, you can use your predefined connections in all subsequent queries.

You can write and run code which utilises the Organisation Service of the connection you are using:

You can easily switch between organisations in your script by simply choosing from the list of available connections:

To help prevent you from accidentally running against a production organisation code which should not be run there, you can mark individual connections as Production; they will then be displayed like this:

Not only can you write and execute code using the CRM Driver for LINQPad, you can also explore the entities within the organisation you are connected to:

and explore the composition of entities in detail:

The connector creates a set of early-bound classes when it is initially set up, and it is therefore possible to write LINQ queries directly against these entities.
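
As a sketch of the sort of query you can write (the exact property names depend on the classes the driver generates for your organisation – AccountSet and the attribute names here are assumptions):

// Query the generated early-bound entities directly in LINQPad
var accountsStartingWithA =
	from a in AccountSet
	where a.Name.StartsWith("A")
	orderby a.Name
	select new { a.Name, a.AccountNumber };

accountsStartingWithA.Take(10).Dump();   // Dump() renders the results in LINQPad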

and the editor gives you full Intellisense on the entity:

In the second article in this series I will show you how you can manually create a connection to CRM in LINQPad and why you might wish to do this.
