Dynamics CRM 2011 bulk update
Running Dynamics CRM 2011 rollout 3. We need to update millions of customer records periodically (delta updates). Using the standard update (one record at a time) would take weeks. We also do not want to touch the database directly, since that could break things down the road.
Is there a bulk update method we can use in the Dynamics CRM 2011 web service / REST API?
Yes and no, mostly no. Someone can correct me if I'm wrong, in which case I'll happily edit/delete my answer, but everything in Dynamics CRM 2011 is done one record at a time. It doesn't even attempt to handle set-based inserts/updates/deletes. So unless you go directly against the database, it really will take weeks.
The web service does allow for "bulk" inserts/deletes/updates, but I put "bulk" in quotes because all it really does is set up an asynchronous process that performs the relevant data operations - yep - one at a time. There is a section of the SDK that addresses this kind of data management (link). To update records that way, you would first have to bear the overhead of selecting all the data you want to update, then build an XML file containing that data, and finally update the data (remember: one row at a time). So looping through the data yourself and issuing an Update request for each record would actually be more efficient (a rough sketch of that loop is below).
(I will note that our organization has had no memorable issues using direct DB access to handle what the SDK doesn't, nor has my own reading around the internet suggested that others have.)
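For reference, a minimal sketch of the one-request-per-record loop described above, using the 2011 SDK's IOrganizationService. The entity ("account"), the attribute ("telephone1"), and the value are placeholders, and paging is omitted:

// Minimal sketch: one Update request per record via the CRM 2011 SDK.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

static void UpdateOneByOne(IOrganizationService service)
{
    // Retrieve only the records that need changing, and only the columns you intend to touch.
    // (Paging is omitted; RetrieveMultiple returns at most one page of results.)
    var query = new QueryExpression("account") { ColumnSet = new ColumnSet("telephone1") };

    foreach (Entity account in service.RetrieveMultiple(query).Entities)
    {
        var delta = new Entity("account") { Id = account.Id };
        delta["telephone1"] = "555-0100";  // placeholder for the real change
        service.Update(delta);             // one round trip per record
    }
}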
Edit:
See iFirefly's answer below for some other excellent ways to address this issue.
I don't know how this will fare with millions of records, but you can select your records and then click the Edit button in the ribbon. This brings up the "Edit Multiple Records" dialog, and any changes you make are applied to all of the selected records.
I realize this post is over two years old, but I'll add to it in case someone else reads it and has a similar need.
Peter Majeed's answer is on target in that CRM processes requests one record at a time. There is no bulk edit that works the way you are looking for. If you need/want Microsoft support, I encourage you not to touch the database directly.
If you are looking at periodic updates of millions of records, you have a few options: consider using Scribe, or develop your own custom import utility or script with the CRM SDK.
Scribe is probably your best option, since it is cost-effective for data imports and lets you update and insert from the same file easily.
If you write your own .NET/SDK-based utility, I would suggest making it multi-threaded: split the input file programmatically, in memory or on disk, and have each thread work with its own subset of the data - assuming, of course, that the order of execution does not have to be chronological with respect to the contents of the input file. If you can divide and conquer the input file across multiple threads, you can cut the overall execution time considerably (a rough sketch follows below). Also, if your corporate policy allows you access to one of the CRM servers, you can place the code directly on that server and run it there, eliminating the network latency between a workstation running the code and the CRM web services.
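A rough sketch of that divide-and-conquer idea, assuming the changed records are already prepared in memory and that serviceFactory (a hypothetical delegate) creates a dedicated connection per partition, since a single service proxy is not safe to share across threads:

// Hypothetical sketch: split the workload and update each partition on its own thread.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Xrm.Sdk;

static void UpdateInParallel(IList<Entity> records, Func<IOrganizationService> serviceFactory, int threadCount = 4)
{
    // Partition the records round-robin into one bucket per thread.
    var partitions = records
        .Select((entity, index) => new { entity, index })
        .GroupBy(x => x.index % threadCount, x => x.entity)
        .ToList();

    Parallel.ForEach(partitions, partition =>
    {
        IOrganizationService service = serviceFactory();  // dedicated connection for this partition
        foreach (Entity record in partition)
        {
            service.Update(record);  // still one request per record, but the partitions run concurrently
        }
    });
}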
Last but not least, if this large volume of import data comes from another system, you can write a CRM plug-in that runs on the Retrieve and RetrieveMultiple messages (events) for your specific entity, programmatically fetches the desired data from the other system (falling back to the cached copy in CRM if the other system is unavailable), and keeps CRM up to date in real time or on a "last cached" basis. This is certainly more coding work, but it can eliminate the need to run a large synchronization job every few weeks.
I realize this is an old question, but it ranks highly for "CRM bulk update", so the Update Rollup 12 feature ExecuteMultiple deserves a mention here. It will not solve your problem (massive volume), because, as iFirefly and Peter point out, CRM does everything one record at a time. What it does do is package all your requests into a single envelope, letting CRM handle the execution of each update and reducing the number of round trips between your application and the server, given that you end up issuing an Update request for every record anyway.
I worked on a very large data-migration project for Dynamics CRM 2011. We needed to load about 3 million records over a weekend. I ended up building a console application (single-threaded) and running multiple instances of it on several machines. Each console application had an id (1, 2, etc.) and was responsible for loading a segment of the data based on a unique SQL WHERE clause that matched the application's id.
You could do the same thing with updates. Each instance could query its subset of the records to update and perform the updates through the SDK. Since we loaded millions of records over a single weekend, I think you could perform millions of (relatively small) updates in a few hours. A hedged sketch of that segmenting idea follows.
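Purely as an illustration (the original answer does not show its code), one way such a segmented console app could claim its share of the work is sketched below; the hash-modulus filter and the "account" entity stand in for whatever SQL WHERE clause the real project used:

// Hypothetical sketch: each console instance processes only "its" segment of the records.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

static void RunSegment(IOrganizationService service, string[] args)
{
    int segmentId = int.Parse(args[0]);     // e.g. 1, 2, 3...
    int segmentCount = int.Parse(args[1]);  // total number of instances

    // A real implementation would push the filter into the query
    // (the original project used a per-instance SQL WHERE clause).
    var query = new QueryExpression("account") { ColumnSet = new ColumnSet(false) };
    foreach (Entity account in service.RetrieveMultiple(query).Entities)
    {
        int bucket = (account.Id.GetHashCode() & int.MaxValue) % segmentCount;
        if (bucket != segmentId - 1)
            continue;  // belongs to another instance

        var delta = new Entity("account") { Id = account.Id };
        delta["description"] = "Updated by segment " + segmentId;  // placeholder change
        service.Update(delta);
    }
}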
The Microsoft PFE team for Dynamics CRM wrote a new CRM SDK library that uses parallelism to bulk-execute requests while guaranteeing thread safety.
You could try it: Execute Requests in Parallel. I would be curious to know whether it works and scales to millions of records.
The BulkUpdate API works for me; it is 10 times faster than updating records one at a time. Here is the snippet that performs the bulk update:
public override ExecuteMultipleResponse BulkUpdate(List<Entity> entities)
{
    ExecuteMultipleRequest request = new ExecuteMultipleRequest()
    {
        Settings = new ExecuteMultipleSettings()
        {
            // Keep processing the remaining requests even if one fails,
            // and return a response for every request so failures can be inspected.
            ContinueOnError = true,
            ReturnResponses = true
        },
        Requests = new OrganizationRequestCollection()
    };

    // Wrap each entity in its own UpdateRequest and add it to the envelope.
    for (int i = 0; i < entities.Count; i++)
    {
        request.Requests.Add(new UpdateRequest() { Target = entities[i] });
    }

    // Send the whole batch to the server in a single call.
    return (ExecuteMultipleResponse)ServiceContext.Execute(request);
}
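Two caveats worth adding, since the answer does not show the calling code: ExecuteMultipleRequest accepts at most 1000 requests per call by default, so a large list must be chunked, and with ContinueOnError = true you should inspect each response item for a fault. A possible calling pattern (allEntities is an assumed List<Entity>, and System.Linq is used for the chunking):

// Hypothetical caller: chunk the entities into batches of 1000 and report any per-record faults.
const int batchSize = 1000;  // default ExecuteMultiple limit per request

for (int offset = 0; offset < allEntities.Count; offset += batchSize)
{
    var batch = allEntities.Skip(offset).Take(batchSize).ToList();
    ExecuteMultipleResponse response = BulkUpdate(batch);

    foreach (var item in response.Responses)
    {
        if (item.Fault != null)
        {
            // RequestIndex maps the fault back to the entity in this batch.
            Console.WriteLine("Update failed for {0}: {1}",
                batch[item.RequestIndex].Id, item.Fault.Message);
        }
    }
}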
CRM does not implement a true bulk update; there are three ways to improve the performance of a bulk update operation, but internally none of them change the fact that CRM updates records one by one. The basic ideas are:
- reduce the time wasted communicating with the CRM server
- use parallelism to perform multiple operations at the same time
- make sure the update process does not trigger any workflows/plug-ins, otherwise you may never see the end of it...

The three ways to improve bulk operation performance:
- Since Rollup 12 there is the ExecuteMultipleRequest feature, which lets you send up to 1000 requests at once. This saves the time of sending 1000 separate requests to the CRM web service, but the requests are still processed one after another on the server, so if your CRM server is well configured this method most likely will not help very much.
- You can use an OrganizationServiceContext instance for the bulk update. OrganizationServiceContext implements the unit-of-work pattern, so you can queue multiple updates and transmit them to the server in a single call (see the sketch after this list). Compared to ExecuteMultipleRequest it has no limit on the number of requests, but if a failure occurs during the update it rolls back all the changes.
- Use multithreading or multitasking. Either way improves speed, but both are likely to produce occasional connection failures or SQL errors, so you will need to add some retry logic to your code.
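As a concrete illustration of the OrganizationServiceContext approach from the second point above, here is a minimal sketch; the method name is hypothetical and the entities are assumed to already carry the changed attribute values:

// Minimal sketch of the unit-of-work approach: queue many updates, send them in one SaveChanges call.
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

static void BulkUpdateWithContext(IOrganizationService service, IEnumerable<Entity> changedEntities)
{
    using (var context = new OrganizationServiceContext(service))
    {
        foreach (Entity entity in changedEntities)
        {
            context.Attach(entity);        // let the context track the entity
            context.UpdateObject(entity);  // mark it as modified
        }

        // All queued updates are transmitted to the server together in one call.
        context.SaveChanges();
    }
}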
This is quite an old question, but nobody has mentioned the fastest (though also the most challenging) way of updating/creating huge numbers of records in CRM 201X: the built-in import feature, which is entirely doable through the CRM SDK. There is a good MSDN article about it: https://msdn.microsoft.com/en-us/library/gg328321(v=crm.5).aspx. In short, you have to:
1) build an Excel file containing the data you want to import (simply export some data from CRM 201X and check what the structure looks like; keep in mind that the first 3 columns are hidden)
2) create an Import Map entity (specifying the file you created)
3) create column mappings if necessary
4) create Import and ImportFile entities, providing the proper mappings
5) parse the data using ParseImportRequest
6) transform the parsed data using TransformImportRequest
7) import the data using ImportRecordsImportRequest
These steps (parse data, transform data, import data) are for CRM 2011; now, in 2017, there are more versions with slight differences between them. Check the example available on MSDN and in the SDK: https://msdn.microsoft.com/en-us/library/hh547396(v=crm.5).aspx
Of course point 1 will be the most difficult part, because you have to build an XML or spreadsheet file that corresponds exactly to what CRM expects, but I'm assuming you are driving this from an external application, so you can use some great .NET libraries that make it much simpler.
When it comes to updating/creating records, I have never seen anything faster than the standard CRM import, even with parallelism and batched update requests.
In case the MSDN site ever goes down, I'm posting here the example from the link above, which shows how to import data into CRM programmatically:
using System;
using System.ServiceModel;
using System.Collections.Generic;
using System.Linq;
// These namespaces are found in the Microsoft.Xrm.Sdk.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;
namespace Microsoft.Crm.Sdk.Samples
{
/// <summary>
/// This sample shows how to define a complex mapping for importing and then use the
/// Microsoft Dynamics CRM 2011 API to bulk import records with that mapping.
/// </summary>
public class ImportWithCreate
{
#region Class Level Members
private OrganizationServiceProxy _serviceProxy;
private DateTime _executionDate;
#endregion
/// <summary>
/// This method first connects to the organization service. Afterwards,
/// auditing is enabled on the organization, account entity, and a couple
/// of attributes.
/// </summary>
/// <param name="serverConfig">Contains server connection information.</param>
/// <param name="promptforDelete">When True, the user will be prompted to delete all
/// created entities.</param>
public void Run(ServerConnection.Configuration serverConfig, bool promptforDelete)
{
using (_serviceProxy = ServerConnection.GetOrganizationProxy(serverConfig))
{
// This statement is required to enable early bound type support.
_serviceProxy.EnableProxyTypes();
// Log the start time to ensure deletion of records created during execution.
_executionDate = DateTime.Today;
ImportRecords();
DeleteRequiredRecords(promptforDelete);
}
}
/// <summary>
/// Imports records to Microsoft Dynamics CRM from the specified .csv file.
/// </summary>
public void ImportRecords()
{
// Create an import map.
ImportMap importMap = new ImportMap()
{
Name = "Import Map " + DateTime.Now.Ticks.ToString(),
Source = "Import Accounts.csv",
Description = "Description of data being imported",
EntitiesPerFile =
new OptionSetValue((int)ImportMapEntitiesPerFile.SingleEntityPerFile),
EntityState = EntityState.Created
};
Guid importMapId = _serviceProxy.Create(importMap);
// Create column mappings.
#region Column One Mappings
// Create a column mapping for a 'text' type field.
ColumnMapping colMapping1 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_name",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "name",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping.
Guid colMappingId1 = _serviceProxy.Create(colMapping1);
#endregion
#region Column Two Mappings
// Create a column mapping for a 'lookup' type field.
ColumnMapping colMapping2 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_parent",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "parentaccountid",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process),
};
// Create the mapping.
Guid colMappingId2 = _serviceProxy.Create(colMapping2);
// Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.
// One lookupmapping will be for the parent account, and the other for the current record.
// This lookupmapping is important because without it the current record
// cannot be used as the parent of another record.
// Create a lookup mapping to the parent account.
LookUpMapping parentLookupMapping = new LookUpMapping()
{
// Relate this mapping with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for an account entity by its name attribute.
LookUpEntityName = Account.EntityLogicalName,
LookUpAttributeName = "name",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.System)
};
// Create the lookup mapping.
Guid parentLookupMappingId = _serviceProxy.Create(parentLookupMapping);
// Create a lookup on the current record's "src_name" so that this record can
// be used as the parent account for another record being imported.
// Without this lookup, no record using this account as its parent will be imported.
LookUpMapping currentLookUpMapping = new LookUpMapping()
{
// Relate this lookup with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for the current record by its src_name attribute.
LookUpAttributeName = "src_name",
LookUpEntityName = "Account_1",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.Source)
};
// Create the lookup mapping
Guid currentLookupMappingId = _serviceProxy.Create(currentLookUpMapping);
#endregion
#region Column Three Mappings
// Create a column mapping for a 'picklist' type field
ColumnMapping colMapping3 = new ColumnMapping()
{
// Set source properties
SourceAttributeName = "src_addresstype",
SourceEntityName = "Account_1",
// Set target properties
TargetAttributeName = "address1_addresstypecode",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with its parent data map
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping
Guid colMappingId3 = _serviceProxy.Create(colMapping3);
// Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
PickListMapping pickListMapping1 = new PickListMapping()
{
SourceValue = "bill",
TargetValue = 1,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId1 = _serviceProxy.Create(pickListMapping1);
// Need a picklist mapping for every address type code expected
PickListMapping pickListMapping2 = new PickListMapping()
{
SourceValue = "ship",
TargetValue = 2,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId2 = _serviceProxy.Create(pickListMapping2);
#endregion
// Create Import
Import import = new Import()
{
// IsImport is obsolete; use ModeCode to declare Create or Update.
ModeCode = new OptionSetValue((int)ImportModeCode.Create),
Name = "Importing data"
};
Guid importId = _serviceProxy.Create(import);
// Create Import File.
ImportFile importFile = new ImportFile()
{
Content = BulkImportHelper.ReadCsvFile("Import Accounts.csv"), // Read contents from disk.
Name = "Account record import",
IsFirstRowHeader = true,
ImportMapId = new EntityReference(ImportMap.EntityLogicalName, importMapId),
UseSystemMap = false,
Source = "Import Accounts.csv",
SourceEntityName = "Account_1",
TargetEntityName = Account.EntityLogicalName,
ImportId = new EntityReference(Import.EntityLogicalName, importId),
EnableDuplicateDetection = false,
FieldDelimiterCode =
new OptionSetValue((int)ImportFileFieldDelimiterCode.Comma),
DataDelimiterCode =
new OptionSetValue((int)ImportFileDataDelimiterCode.DoubleQuote),
ProcessCode =
new OptionSetValue((int)ImportFileProcessCode.Process)
};
// Get the current user to set as record owner.
WhoAmIRequest systemUserRequest = new WhoAmIRequest();
WhoAmIResponse systemUserResponse =
(WhoAmIResponse)_serviceProxy.Execute(systemUserRequest);
// Set the owner ID.
importFile.RecordsOwnerId =
new EntityReference(SystemUser.EntityLogicalName, systemUserResponse.UserId);
Guid importFileId = _serviceProxy.Create(importFile);
// Retrieve the header columns used in the import file.
GetHeaderColumnsImportFileRequest headerColumnsRequest = new GetHeaderColumnsImportFileRequest()
{
ImportFileId = importFileId
};
GetHeaderColumnsImportFileResponse headerColumnsResponse =
(GetHeaderColumnsImportFileResponse)_serviceProxy.Execute(headerColumnsRequest);
// Output the header columns.
int columnNum = 1;
foreach (string headerName in headerColumnsResponse.Columns)
{
Console.WriteLine("Column[" + columnNum.ToString() + "] = " + headerName);
columnNum++;
}
// Parse the import file.
ParseImportRequest parseImportRequest = new ParseImportRequest()
{
ImportId = importId
};
ParseImportResponse parseImportResponse =
(ParseImportResponse)_serviceProxy.Execute(parseImportRequest);
Console.WriteLine("Waiting for Parse async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, parseImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Retrieve the first two distinct values for column 1 from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
GetDistinctValuesImportFileRequest distinctValuesRequest = new GetDistinctValuesImportFileRequest()
{
columnNumber = 1,
ImportFileId = importFileId,
pageNumber = 1,
recordsPerPage = 2,
};
GetDistinctValuesImportFileResponse distinctValuesResponse =
(GetDistinctValuesImportFileResponse)_serviceProxy.Execute(distinctValuesRequest);
// Output the distinct values. In this case: (column 1, row 1) and (column 1, row 2).
int cellNum = 1;
foreach (string cellValue in distinctValuesResponse.Values)
{
Console.WriteLine("(1, " + cellNum.ToString() + "): " + cellValue);
Console.WriteLine(cellValue);
cellNum++;
}
// Retrieve data from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
RetrieveParsedDataImportFileRequest parsedDataRequest = new RetrieveParsedDataImportFileRequest()
{
ImportFileId = importFileId,
PagingInfo = new PagingInfo()
{
// Specify the number of entity instances returned per page.
Count = 2,
// Specify the number of pages returned from the query.
PageNumber = 1,
// Specify a total number of entity instances returned.
PagingCookie = "1"
}
};
RetrieveParsedDataImportFileResponse parsedDataResponse =
(RetrieveParsedDataImportFileResponse)_serviceProxy.Execute(parsedDataRequest);
// Output the first two rows retrieved.
int rowCount = 1;
foreach (string[] rows in parsedDataResponse.Values)
{
int colCount = 1;
foreach (string column in rows)
{
Console.WriteLine("(" + rowCount.ToString() + "," + colCount.ToString() + ") = " + column);
colCount++;
}
rowCount++;
}
// Transform the import
TransformImportRequest transformImportRequest = new TransformImportRequest()
{
ImportId = importId
};
TransformImportResponse transformImportResponse =
(TransformImportResponse)_serviceProxy.Execute(transformImportRequest);
Console.WriteLine("Waiting for Transform async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, transformImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Upload the records.
ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
{
ImportId = importId
};
ImportRecordsImportResponse importResponse =
(ImportRecordsImportResponse)_serviceProxy.Execute(importRequest);
Console.WriteLine("Waiting for ImportRecords async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, importResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
}
/// <summary>
/// Deletes any entity records that were created for this sample.
/// <param name="prompt">Indicates whether to prompt the user
/// to delete the records created in this sample.</param>
/// </summary>
public void DeleteRequiredRecords(bool prompt)
{
bool toBeDeleted = true;
if (prompt)
{
// Ask the user if the created entities should be deleted.
Console.Write("\nDo you want these entity records deleted? (y/n) [y]: ");
String answer = Console.ReadLine();
if (answer.StartsWith("y") ||
answer.StartsWith("Y") ||
answer == String.Empty)
{
toBeDeleted = true;
}
else
{
toBeDeleted = false;
}
}
if (toBeDeleted)
{
// Retrieve all account records created in this sample.
QueryExpression query = new QueryExpression()
{
EntityName = Account.EntityLogicalName,
Criteria = new FilterExpression()
{
Conditions =
{
new ConditionExpression("createdon", ConditionOperator.OnOrAfter, _executionDate),
}
},
ColumnSet = new ColumnSet(false)
};
var accountsCreated = _serviceProxy.RetrieveMultiple(query).Entities;
// Delete all records created in this sample.
foreach (var account in accountsCreated)
{
_serviceProxy.Delete(Account.EntityLogicalName, account.Id);
}
Console.WriteLine("Entity record(s) have been deleted.");
}
}
#region Main method
/// <summary>
/// Standard Main() method used by most SDK samples.
/// </summary>
/// <param name="args"></param>
static public void Main(string[] args)
{
try
{
// Obtain the target organization's web address and client logon
// credentials from the user.
ServerConnection serverConnect = new ServerConnection();
ServerConnection.Configuration config = serverConnect.GetServerConfiguration();
var app = new ImportWithCreate();
app.Run(config, true);
}
catch (FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Timestamp: {0}", ex.Detail.Timestamp);
Console.WriteLine("Code: {0}", ex.Detail.ErrorCode);
Console.WriteLine("Message: {0}", ex.Detail.Message);
Console.WriteLine("Trace: {0}", ex.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == ex.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
catch (System.TimeoutException ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Message: {0}", ex.Message);
Console.WriteLine("Stack Trace: {0}", ex.StackTrace);
Console.WriteLine("Inner Fault: {0}",
null == ex.InnerException.Message ? "No Inner Fault" : ex.InnerException.Message);
}
catch (System.Exception ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine(ex.Message);
// Display the details of the inner exception.
if (ex.InnerException != null)
{
Console.WriteLine(ex.InnerException.Message);
FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> fe = ex.InnerException
as FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault>;
if (fe != null)
{
Console.WriteLine("Timestamp: {0}", fe.Detail.Timestamp);
Console.WriteLine("Code: {0}", fe.Detail.ErrorCode);
Console.WriteLine("Message: {0}", fe.Detail.Message);
Console.WriteLine("Trace: {0}", fe.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == fe.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
}
}
// Additional exceptions to catch: SecurityTokenValidationException, ExpiredSecurityTokenException,
// SecurityAccessDeniedException, MessageSecurityException, and SecurityNegotiationException.
finally
{
Console.WriteLine("Press <Enter> to exit.");
Console.ReadLine();
}
}
#endregion Main method
}
}
An obvious example of bulk create or update in MS CRM is given at the link below: http://mscrmtutorials.blogspot.in/2014/07/bulk-insert-and-bulk-update-in-ms-crm.html – 2014-11-13 17:41:21
What did you end up doing? We use KingswaySoft. – 2017-01-16 20:43:13