General

Getting Started

Using the Entity Framework Profiler is easy. First, we need to make the application that we profile aware of the profiler. Then, just start the profiler.

Preparing an application to be profiled

Install the EntityFrameworkProfiler NuGet package (or the EntityFrameworkProfiler.Appender package to install only the profiler's Appender):

Install-Package EntityFrameworkProfiler -IncludePrerelease

In the application startup (Application_Start in web applications, Program.Main in console applications, or the App constructor for WPF applications), make the following call:

HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.Initialize();

Profiler Shortcut Keys

Beyond the standard operating system shortcut keys, the profiler supports the following shortcut keys:

S - Focus on Sessions tab header
T - Focus on Statements tab header
D - Focus on Details tab header
F - Focus on Statistics tab header
Left/Right - Move to the next/prev tab
Up/Down - Move to next session/statement

Programmatic Access to the Profiler

One of the DLLs that comes with the profiler is HibernatingRhinos.Profiler.Integration. This DLL gives you programmatic access to the profiler object model; on NuGet it's available as the EntityFrameworkProfiler.Integration package. For example, you may use it inside your unit tests to assert on the behavior of your code.


    var captureProfilerOutput = new CaptureProfilerOutput(@"/path/to/profiler.exe");
    captureProfilerOutput.StartListening();
    EntityFrameworkProfiler.Initialize();
    // run code that uses Entity Framework
    Report report = captureProfilerOutput.StopAndReturnReport();
    // Assert / inspect on the report instance
  1. Create a new CaptureProfilerOutput instance and pass it the path to the efprof.exe executable.
  2. Call StartListening to start listening to your application's behavior.
  3. Execute your code.
  4. Call StopAndReturnReport to get the report and assert on / inspect it.
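
Put together inside a unit test, the flow might look like the following sketch. xUnit is assumed, the Entities context mirrors the examples later in this document, and the final assertion is only indicative, since the exact shape of the Report object model is not documented here:

[Fact]
public void Profiler_captures_statements_from_the_code_under_test()
{
    const string conStr = "connection-string"; // placeholder

    var captureProfilerOutput = new CaptureProfilerOutput(@"/path/to/profiler.exe");
    captureProfilerOutput.StartListening();
    EntityFrameworkProfiler.Initialize();

    using (var db = new Entities(conStr))
    {
        var post = db.Posts.FirstOrDefault(); // the code under test
    }

    Report report = captureProfilerOutput.StopAndReturnReport();
    Assert.NotNull(report); // assert on / inspect the report instance here
}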

You can also use this DLL to get programmatic access to the XML report files that are generated during the continuous integration build. This is done using the following code:


var xml = new XmlSerializer(typeof(Report));
using(var reader = File.OpenText(reportFile))
{ 
    Report result = (Report)xml.Deserialize(reader);
}

Using the profiler with Continuous Integration

The profiler supports the following command line interface, which allows you to execute the profiler as part of your continuous integration builds:

/CmdLineMode[+|-]         (short form /C)
/File:<string>            (short form /F)
/ReportFormat:{Json|Html|Xml}  (short form /R)
/Port:<int>               (short form /P)
/InputFile:<string>       (short form /I)
/Shutdown[+|-]            (short form /S)

Starting up with no options will result in the UI showing up.

Here is how we can get a report from our continuous integration build:

EFProf.exe /CmdLineMode /File:report-output /ReportFormat:Json /ReportFormat:Html # starts listening to applications
xunit.console.exe Northwind.IntegrationTests.dll
EFProf.exe /Shutdown # stop listening to applications and output the report

This approach should make it simple to integrate into your CI process. The JSON and XML outputs give you programmatic access to the report, while the HTML version is human readable. You can generate JSON, XML and HTML reports by specifying the ReportFormat option multiple times.

One thing to be aware of: the report file is written asynchronously, so the shutdown command may return before writing the file is done. If you need to process the file as part of your build process, you need to wait until the first profiler instance has exited. Using PowerShell, this is done like this:

EFProf.exe /CmdLineMode /File:report-output /ReportFormat:Json /ReportFormat:Html
xunit.console.exe Northwind.IntegrationTests.dll
EFProf.exe /Shutdown
get-process EFProf | foreach {$_.WaitForExit()} # wait until the report export is completed

Licensing

Please note that from a licensing perspective, the CI mode is the same as the normal GUI mode. On the one hand, this means that you don't need to do anything if you already have the profiler and want to run the CI build on your machine. On the other hand, if you want to run it on a CI machine, you would need an additional license for that.

Usage

Aggregate related sessions under scopes

Entity Framework Profiler can aggregate related sessions under the same scope. By default, in web applications, we aggregate sessions which are opened in the same web page under the same scope. But we provide an API that you can use to aggregate related sessions in your application using a different strategy.

You can use

HibernatingRhinos.Profiler.Appender.ProfilerIntegration.StartScope();

in order to start a new scope, and dispose the returned object in order to end the scope. You can also give a name to the scope using:

ProfilerIntegration.StartScope("Task #" + taskId);

In addition, you can replace the ProfilerIntegration.GetCurrentScopeName function in order to derive the current scope name from your application-specific details:

ProfilerIntegration.GetCurrentScopeName = () => { return "Scope ID"; };

Azure Storage Channel

By default, the profiler logs are sent from an application via TCP. Sometimes, however, your application might not be able to reach the profiler over the network. Using the Azure Storage Channel feature, you can configure the appender to send the logs as blobs to an Azure storage container and analyze them in Entity Framework Profiler. This way you can profile your Entity Framework queries from Azure Functions.

How it works

The idea of the Azure Storage Channel is that profiler logs are uploaded as blobs to an Azure container. In Azure, you configure a subscription so that on each BlobCreated event an appropriate message is created in the Azure storage queue.

Entity Framework Profiler listens to those queue messages, downloads the blobs, analyzes them and displays your queries in real time. You can also load existing profiler logs later on.

Setup on Azure

In order to make the uploaded logs available to Entity Framework Profiler, you need to execute the following steps:

  • create a storage account,
  • create a container,
  • create a storage queue,
  • create a subscription to enqueue messages about blob creation.

You can define it using the Azure portal as shown here, or use an Azure CLI script (Bash, PowerShell).

Appender configuration

When initializing the appender, you need to set the AzureStorageToLogTo property, providing the connection string and the container name where the blobs of profiler logs will be sent.

HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.Initialize(
    new EntityFrameworkAppenderConfiguration
    {
        AzureStorageToLogTo = new AzureStorageConfiguration("azure-storage-connection-string", "container-name")
    });

Connecting to Azure storage from Profiler

In order to be able to load the logs you need to connect to Azure using Options > Azure Storage Channel

Once you're connected to the Azure channel, the profiler will start showing data from the profiled application that is configured to send logs to the Azure container. You can choose whether the logs should be deleted from the container after processing or remain there. If the container already contains profiling data, you can also load it into the profiler by providing a date range.

Configuring Your Application For Offline Profiling

It is possible to generate a snapshot file from your application to be analyzed later using this code:

HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.InitializeOfflineProfiling(filename);

This will generate a file with a snapshot of all the Entity Framework activity in the application, which you can use for later analysis by loading the file into the profiler.
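
A minimal console sketch, where the snapshot file name is an arbitrary choice and RunEntityFrameworkWork stands in for your own code (FlushAllMessages, covered under Common Issues below, ensures nothing is lost if the process exits quickly):

public static void Main(string[] args)
{
    HibernatingRhinos.Profiler.Appender.EntityFramework
        .EntityFrameworkProfiler.InitializeOfflineProfiling("snapshot.efprof");

    RunEntityFrameworkWork(); // your Entity Framework code

    // flush any pending messages before the process exits
    ProfilerInfrastructure.FlushAllMessages();
}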

Ignore / Resume profiling

You can instruct the profiler to ignore some parts of your application, and not publish any events generated from those sections.

This can be achieved in one of the following ways:

using (ProfilerIntegration.IgnoreAll())
{
    // Ignore all events generated from here
}

Or

ProfilerIntegration.IgnoreAll();
// Ignore all events generated from here
ProfilerIntegration.ResumeProfiling();

You'll want to use this feature when you have a massive number of operations that you don't need to profile. A typical scenario for this is building a huge database as part of the test process, which can be time consuming to profile.

Production Profiling

Sometimes you want to profile your production application. It can be a website running on a remote machine or a service application. Using the production profiling feature of Entity Framework Profiler, you can connect to your production application using Options > Remote Servers:

Once you're connected to the remote application, the profiler will start showing data from the remote application.

In order to allow production profiling, your application should add the following line at the very beginning of its startup code:

EntityFrameworkProfiler.InitializeForProduction(9090, "FfIVXyUkze38r/b2ulve26LQ88NK5AYig+ecYzp3r88=");

The above method will make your application start listening on port 9090 for incoming connections. The second parameter we send to this method is a password which will be used to filter out unauthorized access to your application. You should make it strong enough that your application remains secure.

The traffic over the connection between the profiler and your application uses SslStream, so it is encrypted and secured.

Troubleshooting

In order to diagnose issues in your production profiling environment you can set the following value:

ProfilerInfrastructure.ThrowOnConnectionErrors = true;

Common errors are missing permissions for the running service. Setting this value to true will cause the error to be thrown in your application, so you can diagnose what it is.

Star statements using the API

You can star some statements using the ProfilerIntegration API, in order to mark them with a star in the profiler, by calling:

HibernatingRhinos.Profiler.Appender.ProfilerIntegration.StarStatements("Yellow");

After this call, each new statement will get a yellow star. You can stop new statements from being starred using:

ProfilerIntegration.StarStatementsClear();

Here is a usage example:

using (var db = new Entities(conStr))
{
    var post1 = db.Posts.FirstOrDefault();

    using (ProfilerIntegration.StarStatements("Blue"))
    {
        var post2 = db.Posts.FirstOrDefault();
    }

    var post3 = db.Posts.FirstOrDefault();
    
    ProfilerIntegration.StarStatements();
    var post4 = db.Posts.FirstOrDefault();
    ProfilerIntegration.StarStatementsClear();

    var post5 = db.Posts.FirstOrDefault();
}

In the above example, only post2 and post4 will be starred. post2 would get a blue star, while post4 would get the default star, which is yellow.

Note that StarStatements can use the following colors: Yellow, Orange, Red, Green, Blue and Gray.

What Firewall Permissions Does The Profiler Need?

While using the profiler you may be prompted by your firewall to allow the profiler to continue working.

The way the profiler works is by listening on a TCP socket for events from Entity Framework. In order to do that, it needs to listen on a socket, an action that sometimes triggers an alert in firewalls.

Denying permission to listen on a socket will disable the profiler's ability to do live profiling. It is strongly recommended that you allow it.

The profiler also checks for an updated version at each startup. In the case of a profiler error, the profiler will ask you if it is allowed to submit the error details so it can be fixed.

In addition to that, the profiler will periodically report (anonymously) which features are being used in the application, which allows us to focus on commonly used parts of the product.

FAQ

By Per Seat Do You Mean I Need One License For Home And One For Work?

If you bought the license yourself, you may use it wherever you want, as long as only you are using it.

If your company bought the license, then you would require a separate license if you want to use it at home. Per seat means the number of people in an organization using it, not the number of machines it is used on.

Note that there is a corollary to this: a single machine used by two developers, both of them using Entity Framework Profiler, requires two licenses, one for each user of the profiler.

Common Issues

Unable to cast ProfiledConnection to SqlConnection

If you try to cast IDbConnection to SqlConnection in your code, you will get an "unable to cast ProfiledConnection to SqlConnection" exception when using the Entity Framework Profiler. This is because when you call the EntityFrameworkProfiler.Initialize() method, we replace the default DbProviderFactory with our own ProfiledDbProviderFactory.

The solution is to cast IDbConnection to DbConnection instead of SqlConnection. It's better practice not to assume in your code that you know the concrete connection type; always use the IDbConnection interface and the DbConnection base class.
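
For example, a method that accepts the profiled connection might look like this sketch:

public void UseConnection(IDbConnection connection)
{
    // Don't do this - it throws "unable to cast ProfiledConnection to SqlConnection"
    // when the profiler is attached:
    // var sqlConnection = (SqlConnection)connection;

    // Do this instead - depend on the provider-agnostic base class:
    var dbConnection = (DbConnection)connection;
    dbConnection.Open();
}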

The profiler shows the session but without any statements

If you run a program that exits very quickly, you may see the session in the profiler without any statements. This happens because the profiler didn't get a chance to receive the statements before your application exited. To solve this, you can force the appender to flush all pending statements by adding the following line at the end of your code:

ProfilerInfrastructure.FlushAllMessages();

Entity Framework Profiler support for LINQPad

If you're using LINQPad and want to profile your Entity Framework usage with the Entity Framework Profiler, you can use the following snippet of code to achieve that:

void Main()
{
    HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.Initialize();
    Blogs.Take(50).Dump();
}

Please make sure to use the C# Program option from the language ComboBox, and that's it. You can now see the profiling data in the Entity Framework Profiler.

Ignore Connection String

Since version 3.0, the Entity Framework Profiler has a Detect Connection String option. When this option is on, the profiler adds the connection string of the latest connection performed, if that connection does not yet exist in the connection list, and sets it as the default connection.

Since connection strings can contain sensitive information that you may not want to expose, the appender can be configured at initialization to not pass the connection string along.

Example of initializing code:

EntityFrameworkProfiler.Initialize(new EntityFrameworkAppenderConfiguration
{
    IgnoreConnectionStrings = true
});

Resolving A Permission Error When Trying To Save License

You might get an error about access to the license file being denied when first trying to use the Profiler. This is usually a result of Windows locking out the Profiler because it was downloaded from the Internet.

To resolve this issue, go to the folder where you extracted the Profiler, right click on EFProf.exe and select Properties.

Next, click on the Unblock button to grant the Profiler standard application permissions and resolve the issue.

UltraMon Compatibility Issue

There is a known compatibility issue between the Entity Framework Profiler and the UltraMon program on some computers. If you are running UltraMon and you try to maximize the profiler window on one of your monitors, you may get the following exception:

An error occurred in Entity Framework Profiler which is known to be caused by using UltraMon on your computer. 
Please follow the instructions in the following URL in order to solve this issue: http://nhprof.com/Learn/Faq/UltraMon

This happens because UltraMon tries to inject some controls into the Entity Framework Profiler windows.

If you get the above message, please instruct UltraMon not to inject buttons into the Entity Framework Profiler windows. This can be done with the following steps:

  1. Right click on the UltraMon icon in the System Tray and choose Options.
  2. On the Compatibility tab select Configure Settings.
  3. Select the Entity Framework Profiler from the Running Applications list and press Next.
  4. Make sure to check the disable window buttons option and press Finish.
  5. Press OK to exit the UltraMon options window.

When you buy a license, is this a license for some period? Some version? Or do you buy one and keep getting updates forever?

The license includes all updates within a point release. If you buy a license for v1.0, you are going to get all updates for free for any v1.x version.

Alert

Avoid too many joins

The profiler has detected a query that contains a large number of joins, such as this one:

select * from Blogs blog0_
inner join Posts posts1_
    on blog0_.Id = posts1_.BlogId
inner join Comments comments2_
    on posts1_.Id = comments2_.PostId
inner join Users user3_
    on posts1_.UserId = user3_.Id
inner join UsersBlogs users4_
    on blog0_.Id = users4_.BlogId
inner join Users user5_
    on users4_.UserId = user5_.Id

Queries with too many joins might be a performance problem.

Each join requires the database to perform additional work, and the complexity and cost of the query grow rapidly with each additional join. While relational databases are optimized for handling joins, it is often more efficient to perform several separate queries instead of a single query with several joins in it.

For OLTP systems, you should consider simplifying your queries or simplifying the data model. While we do not recommend avoiding joins completely, we strongly discourage queries with large numbers of joins.

Another issue to pay attention to is possible Cartesian products in queries containing joins; it is very easy to create one and not notice it during development.
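
For example, instead of one statement that joins blogs, posts, comments and users, it is sometimes cheaper to issue a few focused queries. A sketch, where the context, entity sets and blogId are illustrative:

using (var ctx = new BlogContext())
{
    var blog = ctx.Blogs.First(b => b.Id == blogId);
    var posts = ctx.Posts.Where(p => p.BlogId == blogId).ToList();

    var postIds = posts.Select(p => p.Id).ToList();
    var comments = ctx.Comments.Where(c => postIds.Contains(c.PostId)).ToList();
}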

Column type mismatch

With an OR/M abstraction such as Entity Framework, it is easy to generate queries which are not optimized. In particular, this alert is raised when we detect that the type of the entity's field, used as a parameter in the query that has this alert, is not an exact match to the table's column type in the database.

A type mismatch between the entity's property type and the table's column type can cause several implicit conversion issues, which can lead to performance issues and conversion overflow issues. Most commonly you'll see them causing problems because they prevent the database from using indexes properly.

Queries still work, and everything looks fine externally, but the cost of the query is many times what it would be otherwise.

Common examples of type mismatches are:

  • Int32 vs bigint

  • Int64 vs int

  • String vs char / varchar / nvarchar (length mismatch)

  • AnsiString vs nvarchar / nchar / varchar (length mismatch)

In order to resolve this issue, you need to make sure that your column type and entity definitions are a match in all respects. Pay attention to the common type mismatches shown above to guide you in this.
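
With Entity Framework code-first, for instance, the fluent mapping can pin the column type and length so they match the database exactly. A sketch, where the entity, type and length are illustrative:

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<User>()
        .Property(u => u.Name)
        .HasColumnType("varchar") // AnsiString instead of the nvarchar default
        .HasMaxLength(50);        // matches the column length exactly
}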

Different parameter sizes result in inefficient query plan cache usage

The profiler detected identical statements that use different sizes for the same parameters. Let us say that we issue two queries, to find users by name. (Note: I am using a syntax that will show you the size of the parameters, to demonstrate the problem.)

We can do this using the following queries.

exec sp_executesql 
      N'SELECT * FROM Users WHERE Username = @username',
      N'@username nvarchar(3)',
      @username=N'bob'
      
exec sp_executesql 
      N'SELECT * FROM Users WHERE Username = @username',
      N'@username nvarchar(4)',
      @username=N'john'

This sort of code results in two query plans stored in the database query cache, because of the different parameter sizes. In fact, if we assume that the Username column has a length of 16, this single query may take up 16 places in the query cache.

Worse, if you have two parameters whose sizes change, such as username (length 16) and password (length 16), you may take up to 256 places in the query cache. Obviously, if you use more parameters, or if their lengths are longer, the number of places that a single query can take in the query cache goes up rapidly.

This can cause performance problems, as the database needs to keep track of more query plans (using more memory) and may need to evict query plans from the cache, which would result in having to rebuild the query plan (increasing server load and query time).

Entity Framework 3.5 doesn't have a solution for this problem. You should consider either moving to Entity Framework 4.0 (which does fix this issue) or switching from an Entity Framework query to an ADO.NET query while explicitly specifying the parameter sizes.
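
A sketch of the ADO.NET route, pinning the declared parameter size to the column length (the size 16 is assumed to match the Username column; connectionString and username are supplied by the caller):

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "SELECT * FROM Users WHERE Username = @username", connection))
{
    // a fixed declared size means every value reuses the same query plan
    command.Parameters.Add("@username", SqlDbType.NVarChar, 16).Value = username;

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        // read the results...
    }
}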

Don’t Query from the View

This alert is raised when the profiler detects that a query was generated from the view in an MVC application. Issuing queries from the view is a bad practice for several reasons.

  • It increases the time that the connection to the database has to be open. The recommendation is to keep the connection open only for the duration of the action, not throughout the lifetime of the request.
  • It makes it that much harder to understand what the data requirements for a particular action are.
  • When writing views, you shouldn't be bothered with thinking about persistence, or the number of queries that your views are generating.
  • The views are often the most changeable parts of the application, and having the application issue queries from the views may result in significant changes to the application's data access patterns between revisions.
  • Most often, queries from the views result from lazy loading, Select N+1 or similar bad practices.

We strongly recommend that you avoid generating queries in the view. Instead, perform all your queries in the action, and provide the view with in-memory data to render.
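
For example, an MVC controller action might materialize the query and hand the view a plain list. A sketch, where the controller, context and properties are illustrative:

public ActionResult Index()
{
    using (var ctx = new BlogContext())
    {
        // the query runs here, inside the action...
        var posts = ctx.Posts.OrderByDescending(p => p.PublishedAt)
                             .Take(20)
                             .ToList();

        // ...and the view renders in-memory data only
        return View(posts);
    }
}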

Ends with query (like '%...') will force the database to scan the full table

The database is very good at answering queries that look for an exact match, such as this:

select * from Users where Name = 'ayende'

But it requires a lot more work when you are using a like, such as this query:

select * from Users where Name like 'ayende%'

In many cases, the database can still optimize this query, and assuming that you have an index on this field, use the index.

But things are drastically different when you have a query that checks for contains:

select * from Users where Name like '%ayende%'

Or using ends with:

select * from Users where Name like '%ayende'

The problem is that the database cannot use an index for this sort of query, and is forced to issue a full table scan, inspecting each of the values in the database for a match. This tends to be a very inefficient process.

You should carefully consider whether you should use this feature. If you really need to support ends-with and contains queries, consider using either the database's own full-text search indexing, or an external full-text search option, such as Lucene or Solr.
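
In LINQ terms, the difference is between StartsWith, which translates to the index-friendly 'ayende%', and EndsWith / Contains, which translate to the scan-forcing patterns above. A sketch, assuming ctx is your context:

// can use an index on Name ('ayende%')
var startsWith = ctx.Users.Where(u => u.Name.StartsWith("ayende")).ToList();

// force a full table scan ('%ayende' and '%ayende%')
var endsWith = ctx.Users.Where(u => u.Name.EndsWith("ayende")).ToList();
var contains = ctx.Users.Where(u => u.Name.Contains("ayende")).ToList();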

Excessive Number Of Rows Returned

The excessive number of rows returned warning is generated from the profiler when a query is returning a large number of rows. The simplest scenario is that we loaded all the rows in a large table, using something like the following code snippet:

var allPosts = blogDataContext.Posts.ToList();

This is a common mistake when you are binding to a UI component (such as a grid) that performs its own paging. This is a problem on several levels:

  • We tend to want to see only part of the data
  • We just loaded a whole lot of unnecessary data
  • We are sending more data than necessary over the network
  • We have a higher memory footprint than we should
  • In extreme cases, we may crash as a result of an out of memory exception

None of these are good, and like the discussion on unbounded result sets, this problem can be easily prevented by applying a limit at the database level to the number of rows that we want to load at any given time.
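
The fix is the same Take() call discussed under the unbounded result set alert, for example:

// load a single, predictable page instead of the whole table
var posts = blogDataContext.Posts
    .OrderBy(p => p.Id) // a stable order makes paging deterministic
    .Take(50)
    .ToList();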

Large Number Of Individual Writes

This warning is raised when the profiler detects that you are writing a lot of data to the database. Similar to the warning about too many calls to the database, the main issue here is the number of remote calls and the time they take.

Unlike the more common Select N+1, the issue with this alert isn't the number of selects, but the number of write statements (delete, create, update) that are being sent to the database. There is no single solution to this problem, but a common approach is to change the write behavior from using Entity Framework directly to a batch or bulk approach, such as performing the changes in a stored procedure in the database itself.
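
For example, when inserting a large number of rows into SQL Server, SqlBulkCopy gets them there in a single bulk operation instead of one INSERT per entity. A sketch, where the table name and the pre-filled DataTable are illustrative:

// rows were collected into a DataTable instead of being tracked as entities
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.Measurements";
    bulkCopy.WriteToServer(measurementsTable); // one round trip for all rows
}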

More than one session per request

Using more than one object context per request is generally a bad practice. Here is why:

  1. Each object context has its own database connection. Using more than one object context means using more than one database connection per request, resulting in additional strain on the database and slower overall performance.
  2. Typically we expect the object context to keep track of our entities. When we have multiple object contexts, each object context is not aware of the entities tracked by the other object contexts, and might have to query the database again for their current state or issue an unnecessary update.
  3. Having more than a single object context also means that we can't take advantage of Entity Framework's Unit of Work, and have to manually manage our entities' change tracking. We might also end up with multiple instances of the same entities in the same request (which using a single object context for the whole request would prevent).
  4. Having more than one object context means that the ORM has more work to do. In most cases, this is unnecessary and should be avoided.

It's usually recommended to use one object context per request. You should investigate the Session Per Request pattern.

Typically this situation results from micromanaging the object context, meaning that we create the object context just before the query and destroy it immediately after the query or operation is executed. For example, see the following code:

public T GetEntity<T>(int id) where T : class
{
    using (var ctx = new NorthwindContext())
    {
        // the context is created and thrown away for a single call
        return ctx.Set<T>().Find(id);
    }
}

It's strongly recommended to use a contextual object context, such as the Session Per Request pattern mentioned above.
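
A minimal sketch of such a contextual object context for a web application, keyed on the current request via System.Web's HttpContext (disposal at end-of-request is elided):

public static class ContextPerRequest
{
    private const string Key = "current-northwind-context";

    public static NorthwindContext Current
    {
        get
        {
            var ctx = (NorthwindContext)HttpContext.Current.Items[Key];
            if (ctx == null)
            {
                // first use in this request: create and stash the context
                ctx = new NorthwindContext();
                HttpContext.Current.Items[Key] = ctx;
            }
            return ctx;
        }
    }
}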

Multiple write Object Contexts in the same request

Writing to the database from more than one object context in the same request is bad for several reasons. Here is why:

  1. Each object context uses a separate database connection. Using more than one object context means using more than one database connection per request. This can hurt overall performance, and puts more pressure on the database.
  2. Using different object contexts means that we cannot take advantage of the database's built-in transaction support and have to rely on System.Transactions, which is significantly slower.
  3. We can't rely on the database to ensure a transactionally safe view of the data, since we are using several different transactions to access the database. Note that this is the case even when using System.Transactions.
  4. Using System.Transactions forces you to use the DTC in order to keep all sessions in the same transaction. Using the DTC leads to bad performance and is more brittle than not using it.
  5. Without System.Transactions, there is no way to use a transaction across all the sessions.

For example, let us consider the following code, which is using multiple object contexts to perform a single operation:

public void TransferMoney(Account from, Account to, Money amount)
{
    Dao.Withdraw(from, amount);
    Dao.Deposit(to, amount);
}

public void Withdraw(Account account, Money amount)
{
    using(var ctx = new MyBankContext())
    {
        ctx.Accounts.Attach(account);
        account.MoneyAmount -= amount;
        ctx.SaveChanges();
    }
}

public void Deposit(Account account, Money amount)
{
    using(var ctx = new MyBankContext())
    {
        ctx.Accounts.Attach(account);
        account.MoneyAmount += amount;
        ctx.SaveChanges();
    }
}

In this example, we call the database twice, each time from a different object context. Because of that, we cannot take advantage of the database's native transaction support, since each operation happens in a different transaction. Our choices are to either operate essentially without transactions (in which case, money can literally disappear into thin air), or participate in a distributed transaction (System.Transactions).

The solution for this is simple: use just one object context per request.
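
Rewritten with a single context, the transfer becomes one unit of work and one native database transaction. A sketch based on the example above:

public void TransferMoney(Account from, Account to, Money amount)
{
    using (var ctx = new MyBankContext())
    {
        ctx.Accounts.Attach(from);
        ctx.Accounts.Attach(to);

        from.MoneyAmount -= amount;
        to.MoneyAmount += amount;

        // one context, one connection, one transaction
        ctx.SaveChanges();
    }
}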

Queries & data binding shouldn’t mix

This alert is raised whenever the profiler detects that a query has been generated as a result of a data binding operation.

The simplest form of this is binding to a LINQ query against the database:

DataContext = from user in ctx.Users
              where user.IsActive
              select user;

The problem with this is that the WinForms / WPF data binding code was designed with the assumption that it would work against in-memory data. Therefore, setting a data source usually triggers multiple calls to interface methods to obtain the data. As a consequence, when you perform data binding against a database query directly, the query is often evaluated several times, triggering multiple round trips to the server.

The recommendation is to bind to the result of the query:

var activeUsers = from user in ctx.Users
                  where user.IsActive
                  select user;
DataContext = activeUsers.ToList();

Another problem with data binding that the profiler will detect and warn about is lazy loading as a result of a data binding operation. For example, binding to Order.Customer.Name will lead to loading the customer entity and binding to its Name property. The problem with this sort of query is that it is likely to repeat for as many items as there are in the query, leading to Select N+1.

Even assuming that you are binding to just a single entity, lazy loading through data binding is a bad practice. You should use eager loading to load the data up front, rather than let the UI load it in the worst possible manner.

Query on unindexed column

We have detected a query that was made on an unindexed column. The problem is that a query on an unindexed column forces the database to perform a table scan. Such an act requires the database to go through all the rows in the table.

In some cases, on very small tables, that is fine and the best way to find the results you want. On larger tables, however, that can be a prohibitively expensive operation, one which usually ends up biting you only after the data set has grown to production size, failing at 2 AM or thereabouts.

Consider this query carefully and evaluate whether an index needs to be created, whether the query needs to be modified, or whether this table is small enough that a table scan is the appropriate behavior.

Select N+1

Select N + 1 is a data access anti-pattern where the database is accessed in a suboptimal way. Take a look at this code sample, then we'll discuss what is going on. Say you want to show the user all comments from all posts so that they can delete all of the nasty comments. The naive implementation would be something like:

// SELECT * FROM Posts
var postsQuery = from post in blogDataContext.Posts
                 select post;

foreach (Post post in postsQuery)
{   
    // lazy loading of comments list causes:    
    // SELECT * FROM Comments where PostId = @p0   
    post.Comments.Load();
    foreach (Comment comment in post.Comments)
    {       
        //print comment...   
    }
}

In this example, we can see that we are loading a list of posts (the first select) and then traversing the object graph. However, we access the collection in a lazy fashion, causing Entity Framework to go to the database and bring the results back one row at a time. This is incredibly inefficient, and the Entity Framework Profiler will generate a warning whenever it encounters such a case.

The solution for this example is simple. Force an eager load of the collection using the Include method to specify what pieces of the object model we want to include in the initial query.

// SELECT * FROM Posts JOIN Comments ...
var postsQuery = (from post in blogDataContext.Posts.Include("Comments")
                 select post);

foreach (Post post in postsQuery)
{   
    // no lazy loading of the comments list; they were loaded eagerly
    foreach (Comment comment in post.Comments)
    {       
        // print comment...   
    }
}

In this case, we will get a join and only a single query to the database.

Note: this is the classical appearance of the problem. It can also surface in other scenarios, such as calling the database in a loop, or more complex object graph traversals. In those cases, it is generally much harder to see what is causing the issue.

Having said that, Entity Framework Profiler will detect those scenarios just as well, and give you the exact line in the source code that caused this SQL to be generated.

Select N+1 in the same request

Select N + 1 is a data access anti-pattern where the database is accessed in a suboptimal way. This alert in essence is a combination of the classical N+1 with the more than one session per request alerts.

Take a look at this code sample, then we'll discuss what is going on. Say you want to show the user all comments from all posts so that they can delete all of the nasty comments. The naive implementation would be something like:

var posts = Dao.GetRecentPosts(); 
foreach(var post in posts) 
{ 
       post.Comments = Dao.GetCommentsForPost(post); 
}
public IEnumerable<Post> GetRecentPosts()
{
    using(var ctx = new MyBlogContext())
    {
        return (from post in ctx.Posts
            orderby post.PublishedDate descending
            select post)
            .Take(50)
            .ToList();
    }
}
public IEnumerable<Comment> GetCommentsForPost(Post post)
{
    using(var ctx = new MyBlogContext())
    {
        return (from comment in ctx.Comments
            where comment.Post.Id == post.Id
            select comment)
            .ToList();
    }
}

In this example, we can see that we are opening a session and loading a list of posts (the first select) and then for each of the posts that we loaded, we open up a new session, including a new connection to the database, and make an additional query.

Because we are using multiple sessions in this fashion, we can't really take advantage of Entity Framework's features that are meant to deal with this exact situation. The first step is to avoid using more than one session per request, and the next step is to follow the guidance on solving Select N+1 problems for Entity Framework.
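
Rewritten with a single context and eager loading, the whole operation becomes one round trip. A sketch based on the example above:

public IEnumerable<Post> GetRecentPostsWithComments()
{
    using (var ctx = new MyBlogContext())
    {
        return (from post in ctx.Posts.Include("Comments")
                orderby post.PublishedDate descending
                select post)
                .Take(50)
                .ToList(); // posts and their comments in a single query
    }
}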

Too many database calls in the same request

One of the most expensive operations that we can do in our applications is to make a remote call. Going beyond our own process is an extremely expensive operation. Going beyond the local machine is more expensive still.

Calling the database, whether to query or to write, is a remote call, and we want to reduce the number of remote calls as much as possible. This warning is raised when the profiler notices that a single request is making an excessive number of calls to the database. This is usually an indication of a potential optimization in the way the sessions are used in the request.

There are several reasons why this can be:

  • A large number of queries as a result of a Select N + 1
  • Calling the database in a loop
  • Updating (or inserting / deleting) a large number of entities
  • A large number of (different) queries that we execute to perform our task

For the first reason, you can see the suggestions for Select N + 1.

Calling the database in a loop is generally a bug, and should be avoided. Usually you can restructure the code in such a way that you are not required to call the database in that way.

The last two issues have to be dealt with on an individual basis. Sometimes a different approach (such as bulk copy for batching inserts), or a specific query or stored procedure that gets all the data from the database in a single round trip, will solve the problem.

Too Many Database Calls Per Session

One of the most expensive operations that we can do in our applications is to make a remote call. Going beyond our own process is an extremely expensive operation. Going beyond the local machine is more expensive still.

Calling the database, whether to query or to write, is a remote call, and we want to reduce the number of remote calls as much as possible. This warning is raised when the profiler notices that a single session is making an excessive number of calls to the database. This is usually an indication of a potential optimization in the way the session is used.

There are several reasons why this can be:

  • A large number of queries as a result of a Select N + 1
  • Calling the database in a loop
  • Updating (or inserting / deleting) a large number of entities
  • A large number of (different) queries that we execute to perform our task

For the first reason, you can see the suggestions for Select N + 1.

Calling the database in a loop is generally a bug, and should be avoided. Usually you can restructure the code in such a way that you are not required to call the database in that way.

The last two issues have to be dealt with on an individual basis. Sometimes a different approach (such as bulk copy for batching inserts), or a specific query or stored procedure that gets all the data from the database in a single round trip, will solve the problem.

Too many expressions per where clause

With the abstraction of an OR/M such as Entity Framework, it is easy to generate queries which are hard to follow and have performance penalties, since you don't see the exact SQL that is produced at the end. This often encourages you to write code that produces bad SQL queries.

The SQL is bad not only for readability, but also because of the work that the database's query optimizer has to do in order to serve such a query.

In this case, we detected that you have a query with a lot of work inside its where clauses. Having too many expressions inside the where clause can lead to poor performance, especially as you grow and have a big data set.

Consider the complexity of your statement, and whether you can reduce the work done by refactoring your code to reduce the amount of effort required by the database to answer your queries.

Too many nested select statements

With the abstraction of an OR/M such as Entity Framework, it is easy to generate queries which are hard to follow and have performance penalties. Since you don't see the exact SQL that is produced at the end, you can write code that produces bad SQL. The SQL is bad not only for readability, but also because of the work that the database's query analyzer has to do in order to serve such a query.

To demonstrate such a query, look at the query below. It has lots of nested SELECT statements, which makes the SQL hard to follow and also slower for the database to run. The query goes on for quite a while, and it is doing a lot of things, none of them very fast.

The problem with such queries is that the code that generates them looks innocent and hides its costly impact. Consider modifying the queries that raised this warning; even splitting them up into separate queries executed independently might be a faster approach.

SELECT [Project4].[Id]                             AS [Id],
       [Project4].[Name]                           AS [Name],
       [Project4].[Description]                    AS [Description],
       [Project4].[CategoryTemplateId]             AS [CategoryTemplateId],
       [Project4].[MetaKeywords]                   AS [MetaKeywords],
       [Project4].[MetaDescription]                AS [MetaDescription],
       [Project4].[MetaTitle]                      AS [MetaTitle],
       [Project4].[ParentCategoryId]               AS [ParentCategoryId],
       [Project4].[PictureId]                      AS [PictureId],
       [Project4].[PageSize]                       AS [PageSize],
       [Project4].[AllowCustomersToSelectPageSize] AS [AllowCustomersToSelectPageSize],
       [Project4].[PageSizeOptions]                AS [PageSizeOptions],
       [Project4].[PriceRanges]                    AS [PriceRanges],
       [Project4].[ShowOnHomePage]                 AS [ShowOnHomePage],
       [Project4].[IncludeInTopMenu]               AS [IncludeInTopMenu],
       [Project4].[HasDiscountsApplied]            AS [HasDiscountsApplied],
       [Project4].[SubjectToAcl]                   AS [SubjectToAcl],
       [Project4].[LimitedToStores]                AS [LimitedToStores],
       [Project4].[Published]                      AS [Published],
       [Project4].[Deleted]                        AS [Deleted],
       [Project4].[DisplayOrder]                   AS [DisplayOrder],
       [Project4].[CreatedOnUtc]                   AS [CreatedOnUtc],
       [Project4].[UpdatedOnUtc]                   AS [UpdatedOnUtc]
FROM   (SELECT [Limit1].[Id]                             AS [Id],
               [Limit1].[Name]                           AS [Name],
               [Limit1].[Description]                    AS [Description],
               [Limit1].[CategoryTemplateId]             AS [CategoryTemplateId],
               [Limit1].[MetaKeywords]                   AS [MetaKeywords],
               [Limit1].[MetaDescription]                AS [MetaDescription],
               [Limit1].[MetaTitle]                      AS [MetaTitle],
               [Limit1].[ParentCategoryId]               AS [ParentCategoryId],
               [Limit1].[PictureId]                      AS [PictureId],
               [Limit1].[PageSize]                       AS [PageSize],
               [Limit1].[AllowCustomersToSelectPageSize] AS [AllowCustomersToSelectPageSize],
               [Limit1].[PageSizeOptions]                AS [PageSizeOptions],
               [Limit1].[PriceRanges]                    AS [PriceRanges],
               [Limit1].[ShowOnHomePage]                 AS [ShowOnHomePage],
               [Limit1].[IncludeInTopMenu]               AS [IncludeInTopMenu],
               [Limit1].[HasDiscountsApplied]            AS [HasDiscountsApplied],
               [Limit1].[SubjectToAcl]                   AS [SubjectToAcl],
               [Limit1].[LimitedToStores]                AS [LimitedToStores],
               [Limit1].[Published]                      AS [Published],
               [Limit1].[Deleted]                        AS [Deleted],
               [Limit1].[DisplayOrder]                   AS [DisplayOrder],
               [Limit1].[CreatedOnUtc]                   AS [CreatedOnUtc],
               [Limit1].[UpdatedOnUtc]                   AS [UpdatedOnUtc]
        FROM   (SELECT [Distinct1].[Id] AS [Id]
FROM   (SELECT [Extent1].[Id]               AS [Id1],
                [Extent1].[ParentCategoryId] AS [ParentCategoryId],
                [Extent1].[LimitedToStores]  AS [LimitedToStores]
        FROM   [dbo].[Category] AS [Extent1]
                LEFT OUTER JOIN [dbo].[AclRecord] AS [Extent2]
                    ON ([Extent1].[Id] = [Extent2].[EntityId])
                    AND (N'Category' = [Extent2].[EntityName])
        WHERE  ([Extent1].[Published] = 1)
                AND ([Extent1].[Deleted] <> 1)
                AND (([Extent1].[SubjectToAcl] <> 1)
                        OR (([Extent2].[CustomerRoleId] IN (3))
                            AND ([Extent2].[CustomerRoleId] IS NOT NULL)))) AS [Filter1]
        LEFT OUTER JOIN [dbo].[StoreMapping] AS [Extent3]
            ON ([Filter1].[Id1] = [Extent3].[EntityId])
            AND (N'Category' = [Extent3].[EntityName])
WHERE  ([Filter1].[ParentCategoryId] = 7 /* @p__linq__0 */)
        AND (([Filter1].[LimitedToStores] <> 1)
                OR (1 /* @p__linq__1 */ = [Extent3].[StoreId]))) AS [Distinct1]) AS [Project2]
OUTER APPLY (SELECT TOP (1) [Filter3].[Id2]                            AS [Id],
                    [Filter3].[Name]                           AS [Name],
                    [Filter3].[Description]                    AS [Description],
                    [Filter3].[CategoryTemplateId]             AS [CategoryTemplateId],
                    [Filter3].[MetaKeywords]                   AS [MetaKeywords],
                    [Filter3].[MetaDescription]                AS [MetaDescription],
                    [Filter3].[MetaTitle]                      AS [MetaTitle],
                    [Filter3].[ParentCategoryId]               AS [ParentCategoryId],
                    [Filter3].[PictureId]                      AS [PictureId],
                    [Filter3].[PageSize]                       AS [PageSize],
                    [Filter3].[AllowCustomersToSelectPageSize] AS [AllowCustomersToSelectPageSize],
                    [Filter3].[PageSizeOptions]                AS [PageSizeOptions],
                    [Filter3].[PriceRanges]                    AS [PriceRanges],
                    [Filter3].[ShowOnHomePage]                 AS [ShowOnHomePage],
                    [Filter3].[IncludeInTopMenu]               AS [IncludeInTopMenu],
                    [Filter3].[HasDiscountsApplied]            AS [HasDiscountsApplied],
                    [Filter3].[SubjectToAcl]                   AS [SubjectToAcl],
                    [Filter3].[LimitedToStores]                AS [LimitedToStores],
                    [Filter3].[Published]                      AS [Published],
                    [Filter3].[Deleted]                        AS [Deleted],
                    [Filter3].[DisplayOrder]                   AS [DisplayOrder],
                    [Filter3].[CreatedOnUtc]                   AS [CreatedOnUtc],
                    [Filter3].[UpdatedOnUtc]                   AS [UpdatedOnUtc]
        FROM   (SELECT [Extent4].[Id]                             AS [Id2],
                    [Extent4].[Name]                           AS [Name],
                    [Extent4].[Description]                    AS [Description],
                    [Extent4].[CategoryTemplateId]             AS [CategoryTemplateId],
                    [Extent4].[MetaKeywords]                   AS [MetaKeywords],
                    [Extent4].[MetaDescription]                AS [MetaDescription],
                    [Extent4].[MetaTitle]                      AS [MetaTitle],
                    [Extent4].[ParentCategoryId]               AS [ParentCategoryId],
                    [Extent4].[PictureId]                      AS [PictureId],
                    [Extent4].[PageSize]                       AS [PageSize],
                    [Extent4].[AllowCustomersToSelectPageSize] AS [AllowCustomersToSelectPageSize],
                    [Extent4].[PageSizeOptions]                AS [PageSizeOptions],
                    [Extent4].[PriceRanges]                    AS [PriceRanges],
                    [Extent4].[ShowOnHomePage]                 AS [ShowOnHomePage],
                    [Extent4].[IncludeInTopMenu]               AS [IncludeInTopMenu],
                    [Extent4].[HasDiscountsApplied]            AS [HasDiscountsApplied],
                    [Extent4].[SubjectToAcl]                   AS [SubjectToAcl],
                    [Extent4].[LimitedToStores]                AS [LimitedToStores],
                    [Extent4].[Published]                      AS [Published],
                    [Extent4].[Deleted]                        AS [Deleted],
                    [Extent4].[DisplayOrder]                   AS [DisplayOrder],
                    [Extent4].[CreatedOnUtc]                   AS [CreatedOnUtc],
                    [Extent4].[UpdatedOnUtc]                   AS [UpdatedOnUtc]
                FROM   [dbo].[Category] AS [Extent4]
                    LEFT OUTER JOIN [dbo].[AclRecord] AS [Extent5]
                        ON ([Extent4].[Id] = [Extent5].[EntityId])
                            AND (N'Category' = [Extent5].[EntityName])
                WHERE  ([Extent4].[Published] = 1)
                    AND ([Extent4].[Deleted] <> 1)
                    AND (([Extent4].[SubjectToAcl] <> 1)
                            OR (([Extent5].[CustomerRoleId] IN (3))
                                AND ([Extent5].[CustomerRoleId] IS NOT NULL)))) AS [Filter3]
            LEFT OUTER JOIN [dbo].[StoreMapping] AS [Extent6]
                ON ([Filter3].[Id2] = [Extent6].[EntityId])
                    AND (N'Category' = [Extent6].[EntityName])
        WHERE  ([Filter3].[ParentCategoryId] = 7 /* @p__linq__0 */)
            AND (([Filter3].[LimitedToStores] <> 1)
                    OR (1 /* @p__linq__1 */ = [Extent6].[StoreId]))
            AND ([Project2].[Id] = [Filter3].[Id2])) AS [Limit1]) AS [Project4]
ORDER  BY [Project4].[DisplayOrder] ASC

Too many tables in select statement

With the abstraction of an OR/M such as Entity Framework, it is easy to generate queries which are not optimized.

In this case, we detected that your query touches a large number of tables, which may lead to poor performance and run too slowly, especially once you grow to a large data set. The more tables there are in the query, the more work the database has to do, whether this is via subselects, joins or nested queries.

Consider optimizing the query to use fewer tables, even splitting it up into multiple separate queries if that proves to be more performant.

Example query with multiple tables:

select blog0_.Id as Id15_0_,
  posts1_.Id as Id20_1_,
  comments2_.Id as Id19_2_,
  blog0_.Subtitle as Subtitle15_0_,
  blog0_.AllowsComments as AllowsCo4_15_0_,
  blog0_.CreatedAt as CreatedAt15_0_,
  posts1_.Title as Title20_1_,
  posts1_.BlogId as BlogId20_1_,
  comments2_.PostId as PostId1__,
  comments2_.Id as Id1__,
  user3_.Password as Password22_3_,
  user3_.Username as Username22_3_,
  user3_.Bio as Bio22_3_,
  user5_.Password as Password22_4_,
  user5_.Username as Username22_4_,
  user5_.Bio as Bio22_4_,
  users4_.BlogId as BlogId2__,
  users4_.UserId as UserId2__
from Blogs blog0_
  inner join Posts posts1_
   on blog0_.Id = posts1_.BlogId
  inner join Comments comments2_
   on posts1_.Id = comments2_.PostId
  inner join Users user3_
   on posts1_.UserId = user3_.Id
  inner join UsersBlogs users4_
   on blog0_.Id = users4_.BlogId
  inner join Users user5_
on users4_.UserId = user5_.Id

Too many where clauses in statement

With the abstraction of an OR/M such as Entity Framework, it is easy to create code which is hard to follow and has performance penalties, since you don't see the exact SQL that is produced at the end. This often encourages you to write code that produces bad SQL. The SQL is bad not only for readability, but also because of the work that the database's query analyzer has to do in order to serve such a query.

In this case, we detected that you have a query with a lot of where clauses. Having too many where clauses can lead to poor performance, especially when you grow and have a big data set.

We suggest that you keep the where clauses as few as possible.

Unbounded result set

An unbounded result set occurs when a query is performed without explicitly limiting the number of returned results. Usually, this means that the application assumes a query will always return only a few records. That works well in development and in testing, but it is a time bomb waiting to explode in production.

The query may suddenly start returning thousands upon thousands of rows, and in some cases, it may return millions of rows. This leads to more load on the database server, the application server, and the network. In many cases, it can grind the entire system to a halt, usually ending with the application servers crashing with out of memory errors.

Here is one example of a query that will trigger the unbounded result set warning:

var query = from post in blogDataContext.Posts
            where post.Category == "Performance"
            select post;

If the performance category has many posts, we are going to load all of them, which is probably not what was intended. This can be fixed fairly easily by using pagination with the Take() method:

var query = (from post in blogDataContext.Posts            
            where post.Category == "Performance"            
            select post)
            .Take(15);

Now we are assured that we only need to handle a predictable, small result set, and if we need to work with all of them, we can page through the records as needed. Paging is implemented using the Skip() method, which instructs Entity Framework to skip (at the database level) N records before taking the next page.
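
For example, fetching the second page of 15 posts might look like this (pageIndex is a hypothetical variable; note that Skip() requires an explicit ordering):

var page = (from post in blogDataContext.Posts
            where post.Category == "Performance"
            orderby post.Id
            select post)
            .Skip(pageIndex * 15) // skipped at the database level
            .Take(15)
            .ToList();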

But there is another common occurrence of the unbounded result set problem: directly traversing the object graph, as in the following example:

var post = postRepository.Get(id);
foreach (var comment in post.Comments)
{
    // do something interesting with the comment
}

Here, again, we are loading the entire set without regard for how big the result set may be. Entity Framework does not provide a good way of paging through a collection when traversing the object graph. It is recommended that you issue a separate and explicit query for the contents of the collection, which will allow you to page through that collection without loading too much data into memory.
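
A sketch of such an explicit query, paging through the post's comments instead of traversing post.Comments (pageIndex is again hypothetical; the entity properties are assumed from the earlier examples):

var comments = blogDataContext.Comments
    .Where(c => c.PostId == id)
    .OrderBy(c => c.Id)
    .Skip(pageIndex * 25) // page through the collection
    .Take(25)
    .ToList();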

Using A Single Object Context In Multiple Threads Is Likely A Bug

Entity Framework Profiler has detected a data context that is used in a different thread than the one it was opened on.

Entity Framework object contexts are not thread safe, and attempting to use them in multiple threads requires careful synchronization. It is generally better to consider an object context only useful within the thread that created it.

There are valid scenarios for cross-thread object context usage (loading data on a background thread, for example), but usually this alert indicates a problem in serializing access to the data context.

If you are using the object context across multiple threads intentionally and are certain that you are correctly serializing access to it, then you can safely ignore this alert.

You can also disable this alert completely in the options dialog.

Error

Error Was Detected

Whenever Entity Framework runs into an error (transaction deadlock, optimistic concurrency exception, etc.), the Entity Framework Profiler will detect it and show you the full error (including the exception message and the stack trace).

SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.

Most of the time, this error is caused by using DateTime.MinValue or DateTime.MaxValue. Make sure that your datetime value is within the supported range of SQL Server.
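
If you genuinely need a sentinel date, one defensive option is to clamp values into the SQL Server range before saving. A minimal sketch using System.Data.SqlTypes (value stands in for the DateTime you are about to persist):

// SqlDateTime.MinValue.Value is 1/1/1753 12:00:00 AM
DateTime safeValue = value < SqlDateTime.MinValue.Value
    ? SqlDateTime.MinValue.Value
    : value;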