
Entity Framework EF Core efcore Bulk Batch Extensions with BulkCopy in .Net for Insert Update Delete Read (CRUD), Truncate and SaveChanges operations on SQL Server, PostgreSQL, MySQL, SQLite

Home Page: https://codis.tech/efcorebulk

License: Other

C# 97.12% TSQL 0.58% Batchfile 2.30%
entity-framework-core entityframeworkcore entityframework efcore bulk batch sql sqlbulkcopy sqlserver sqlite

efcore.bulkextensions's Introduction

EFCore.BulkExtensions

EntityFrameworkCore extensions (performance improvement):
-Bulk operations (super fast): Insert, Update, Delete, Read, Upsert, Sync, SaveChanges.
-Batch ops: Delete, Update - deprecated from EF8, since EF7+ has native ExecuteUpdate/ExecuteDelete.
-Additional op: Truncate.
The library is lightweight and very efficient (warp speed), covering the most commonly used CRUD operations.
It was selected among the top 20 EF Core extensions recommended by Microsoft.
The latest version uses EF Core 8.
Supports all 4 major SQL databases: SQLServer, PostgreSQL, MySQL, SQLite.
Check out Testimonials from the Community and User Comments.

Also take a look at the other packages:
-Open source (MIT or cFOSS) authored .Net libraries (@Infopedia.io blog post)

#  .Net library                  Description
1  EFCore.BulkExtensions         EF Core Bulk CRUD Ops (Flagship Lib)
2  EFCore.UtilExtensions         EF Core Custom Annotations and AuditInfo
3  EFCore.FluentApiToAnnotation  Converting FluentApi configuration to Annotations
4  FixedWidthParserWriter        Reading & Writing fixed-width/flat data files
5  CsCodeGenerator               C# code generation based on Classes and elements
6  CsCodeExample                 Examples of C# code in form of a simple tutorial

License

BulkExtensions is licensed under Dual License v1.0 (a solution to open-source funding; cFOSS: conditionally Free OSS).
If you do not meet the criteria for free usage of the software under the community license, then you have to buy a commercial one.
If you are eligible for free usage but still need active support, consider purchasing a Starter License.

Support

If you find this project useful you can mark it by leaving a GitHub Star.
And even with the community license, if you want to help development, you can make a DONATION:
Buy Me A Coffee

Contributing

Please read CONTRIBUTING for details on the code of conduct and the process for submitting pull requests.
When opening issues, please write a detailed explanation of the problem or feature, with a reproducible example.

Description

Supported databases:
-SQLServer (or AzureSQL) under the hood uses SqlBulkCopy for Insert; Update/Delete are done as BulkInsert into a temp table plus a raw SQL MERGE.
-PostgreSQL (9.5+) uses COPY BINARY combined with ON CONFLICT for Update.
-MySQL (8+) uses MySqlBulkCopy combined with ON DUPLICATE for Update.
-SQLite has no copy tool; instead the library uses plain SQL combined with UPSERT.
Bulk tests cannot use UseInMemoryDb, because the InMemory provider does not support relational-specific methods.
Test options are instead SqlServer (Developer or Express), LocalDb (if alongside the Developer version), or other adapters.

Installation

Available on NuGet.
That is the main NuGet package for all databases; there are also provider-specific packages for those who need smaller ones.
Only a single provider-specific package can be installed in a project; if more are needed, use the main one with all providers.
Package Manager Console command for installation: Install-Package EFCore.BulkExtensions
Provider-specific packages have an adapter suffix: MainNuget + .SqlServer/PostgreSql/MySql/Sqlite
The assembly is strong-named and signed with a key.

Nuget  Target           Used EF v.  For projects targeting
8.x    Net 8.0          EF Core 8   Net 8.0+
7.x    Net 6.0          EF Core 7   Net 7.0+ or 6.0+
6.x    Net 6.0          EF Core 6   Net 6.0+
5.x    NetStandard 2.1  EF Core 5   Net 5.0+
3.x    NetStandard 2.0  EF Core 3   NetCore(3.0+) or NetFrm(4.6.1+) MoreInfo
2.x    NetStandard 2.0  EF Core 2   NetCore(2.0+) or NetFrm(4.6.1+)
1.x    NetStandard 1.4  EF Core 1   NetCore(1.0+)

Support follows the official .NET lifecycle: currently v8 (LTS) as the latest, alongside v7 and v6 (LTS).

Usage

Usage is pretty simple and straightforward.
Bulk extension methods are defined on DbContext and are used with a List of entities (both regular and Async methods are supported):

context.BulkInsert(entities);                 context.BulkInsertAsync(entities);
context.BulkInsertOrUpdate(entities);         context.BulkInsertOrUpdateAsync(entities);    //Upsert
context.BulkInsertOrUpdateOrDelete(entities); context.BulkInsertOrUpdateOrDeleteAsync(entities); //Sync
context.BulkUpdate(entities);                 context.BulkUpdateAsync(entities);
context.BulkDelete(entities);                 context.BulkDeleteAsync(entities);
context.BulkRead(entities);                   context.BulkReadAsync(entities);
context.BulkSaveChanges();                    context.BulkSaveChangesAsync();

-MySQL: when running its tests for the first time, execute the SQL command (enables local data loading): SET GLOBAL local_infile = true;
-SQLite requires the package SQLitePCLRaw.bundle_e_sqlite3 with a call to SQLitePCL.Batteries.Init()
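
A minimal sketch of the SQLite setup; MyDbContext and entities are placeholders for your own context and data:

using EFCore.BulkExtensions;

SQLitePCL.Batteries.Init(); // from SQLitePCLRaw.bundle_e_sqlite3, call once at application startup

using var context = new MyDbContext(); // hypothetical SQLite-backed context
context.BulkInsert(entities);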

Batch extension methods are defined on IQueryable DbSet and can be used as in the following code segment.
They are executed as pure SQL; no check is done on whether some entities were previously loaded into memory and are being tracked.
(updateColumns is an optional parameter in which property names are added explicitly when a column needs to be updated to its default value.)
Info about lock escalation in SQL Server, with a batch-iteration example as a solution, is at the bottom of the code segment.

// Delete
context.Items.Where(a => a.ItemId >  500).BatchDelete();
context.Items.Where(a => a.ItemId >  500).BatchDeleteAsync();

// Update (using Expression arg.) supports Increment/Decrement 
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(a => new Item { Quantity = a.Quantity + 100});
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(a => new Item {Quantity=a.Quantity+100});
  // can be as value '+100' or as variable '+incrementStep' (int incrementStep = 100;)
  
// Update (via simple object)
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(new Item { Description = "Updated" });
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(new Item { Description = "Updated" });
// Update (via simple object) - requires additional Argument for setting to Property default value
var updateCols = new List<string> { nameof(Item.Quantity) }; //Update 'Quantity' to default val:'0'
var q = context.Items.Where(a => a.ItemId <= 500);
int affected = q.BatchUpdate(new Item { Description="Updated" }, updateCols); //result assigned aff.

// Batch iteration (useful in some cases to avoid lock escalation)
int chunkSize = 10000;
int rowsAffected;
var query = context.Items.Where(a => a.ItemId > 500);
do {
    rowsAffected = query.Take(chunkSize).BatchDelete();
} while (rowsAffected >= chunkSize);

// Truncate
context.Truncate<Entity>();
context.TruncateAsync<Entity>();

Performance

The following are performance measurements (in seconds):

  • For SQL Server (v. 2019):
Ops \ Rows   EF 100K   Bulk 100K   EF 1 MIL.   Bulk 1 MIL.
Insert       11 s      3 s         60 s        15 s
Update       8 s       4 s         84 s        27 s
Delete       50 s      3 s         5340 s      15 s

TestTable has 6 columns (Guid, string x2, int, decimal?, DateTime); all were inserted and 2 were updated.
Tests were done locally on this configuration: INTEL i7-10510U CPU 2.30GHz, DDR3 16 GB, SSD SAMSUNG 512 GB.
For small data sets there is an overhead, since most bulk ops need to create a temp table and also drop it when finished.
A good rule of thumb is to use bulk ops for sets larger than around 1000 rows.

Bulk info

If Windows Authentication is used, the ConnectionString should contain Trusted_Connection=True;, because SQL credentials are required to stay with the connection.
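
A sketch of such a connection string; the server and database names are placeholders, and TrustServerCertificate is only needed for local/dev certificates:

// e.g. inside DbContext.OnConfiguring
optionsBuilder.UseSqlServer(
    "Server=localhost;Database=MyDb;Trusted_Connection=True;TrustServerCertificate=True;");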

When used directly, each of these operations is a separate transaction and is automatically committed.
If multiple operations are needed in a single procedure, an explicit transaction should be used, for example:

using (var transaction = context.Database.BeginTransaction())
{
    context.BulkInsert(entities1List);
    context.BulkInsert(entities2List);
    transaction.Commit();
}

The BulkInsertOrUpdate method can be used when both operations are needed over one connection to the database.
It does an Update when the PK (PrimaryKey) is matched, otherwise it does an Insert.

BulkInsertOrUpdateOrDelete effectively synchronizes table rows with the input data.
Rows in the Db that are not found in the list will be deleted.
Partial sync can be done on a table subset, using an expression set on the config with the method:
bulkConfig.SetSynchronizeFilter<Item>(a => a.Quantity > 0);
Not supported for SQLite (SQLite has only the UPSERT statement), nor currently for PostgreSQL. A way to achieve sync functionality there is to Select or BulkRead the existing data from the Db, split the list into sublists, and call the Bulk methods separately for BulkInsertOrUpdate and Delete, as sketched below.
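
A minimal sketch of that workaround, assuming Item has an int ItemId key and entities is the desired final table state:

var incomingIds = entities.Select(e => e.ItemId).ToHashSet();
var toDelete = context.Items.AsNoTracking()
    .Where(a => !incomingIds.Contains(a.ItemId))
    .ToList();

context.BulkInsertOrUpdate(entities); // insert new rows, update existing ones
context.BulkDelete(toDelete);         // delete rows missing from the input list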

BulkRead (SELECT and JOIN done in SQL)
Used when you need to select from a big list, based on unique properties/columns specified in the config's UpdateByProperties.

// instead of Where with Contains (SQL IN), which will time out for lists of over around 40K records
var entities = context.Items.Where(a=> itemsNames.Contains(a.Name)).AsNoTracking().ToList();//SQL IN
// or a JOIN in memory, which loads the entire table
var entities = context.Items.Join(itemsNames, a => a.Name, p => p,(a,p)=>a).AsNoTracking().ToList();

// USE
var items = itemsNames.Select(a => new Item { Name = a }).ToList(); // Items list with only Name set
var bulkConfig = new BulkConfig { UpdateByProperties = new List<string> { nameof(Item.Name) } };
context.BulkRead(items, bulkConfig); //Items list will be loaded from Db with data(other properties)

A useful config is ReplaceReadEntities, which works like Contains/IN and returns all rows matching the criteria (not only unique matches); see the sketch below.
There is also an example of a special use case: BulkReading child entities after BulkReading the parent list.
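
A sketch of ReplaceReadEntities usage, reusing the Item/Name example above:

var items = itemsNames.Select(a => new Item { Name = a }).ToList();
var bulkConfig = new BulkConfig
{
    UpdateByProperties = new List<string> { nameof(Item.Name) },
    ReplaceReadEntities = true // 'items' is repopulated with all matching rows, not only unique ones
};
context.BulkRead(items, bulkConfig);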

SaveChanges uses the Change Tracker to find all modified (CUD) entities and calls the proper bulk operation for each table.
Because it needs tracking, it is slower than the pure bulk ops, but still much faster than regular SaveChanges.
With the config OnSaveChangesSetFK, setting FKs can be controlled depending on whether PKs are generated in the Db or in memory.
Support for this method was added in version 6 of the library.
Before calling this method, newly created entities should be added with AddRange:

context.Items.AddRange(newEntities); // if newEntities is parent list it can have child sublists
context.BulkSaveChanges();

A practical general pattern is to override the regular SaveChanges and, if the number of modified entity entries is greater than, say, 1000, redirect to the Bulk version, as sketched below.
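
A sketch of such an override, assuming it lives in your own DbContext subclass; the 1000 threshold is an arbitrary example value:

// requires: using System.Linq; using Microsoft.EntityFrameworkCore; using EFCore.BulkExtensions;
public override int SaveChanges()
{
    var changedCount = ChangeTracker.Entries()
        .Count(e => e.State == EntityState.Added
                 || e.State == EntityState.Modified
                 || e.State == EntityState.Deleted);

    if (changedCount > 1000)
    {
        this.BulkSaveChanges(); // extension method from EFCore.BulkExtensions
        return changedCount;    // approximation, since BulkSaveChanges does not return a count
    }
    return base.SaveChanges();
}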

Note: Bulk ops have an optional argument Type type that can be set to the entity type when the list contains dynamic runtime objects, or objects inherited from the entity class.

BulkConfig arguments

Bulk methods can take an optional BulkConfig argument with these properties (bool, int, object, List):

PROPERTY : DEFAULT value
----------------------------------------------------------------------------------------------
PreserveInsertOrder: true,                    PropertiesToInclude: null,
SetOutputIdentity: false,                     PropertiesToIncludeOnCompare: null,
SetOutputNonIdentityColumns: true,            PropertiesToIncludeOnUpdate: null,
LoadOnlyIncludedColumns: false,               PropertiesToExclude: null,
BatchSize: 2000,                              PropertiesToExcludeOnCompare: null,
NotifyAfter: null,                            PropertiesToExcludeOnUpdate: null,
BulkCopyTimeout: null,                        UpdateByProperties: null,
TrackingEntities: false,                      ReplaceReadEntities: false,
UseTempDB: false,                             EnableShadowProperties: false,
UniqueTableNameTempDb: true,                  
CustomDestinationTableName: null,             IncludeGraph: false,
CustomSourceTableName: null,                  OmitClauseExistsExcept: false,
CustomSourceDestinationMappingColumns: null,  DoNotUpdateIfTimeStampChanged: false,
OnConflictUpdateWhereSql: null,               SRID: 4326,
WithHoldlock: true,                           DateTime2PrecisionForceRound: false,
CalculateStats: false,                        TemporalColumns: { "PeriodStart", "PeriodEnd" },
SqlBulkCopyOptions: Default,                  OnSaveChangesSetFK: true,
SqlBulkCopyColumnOrderHints: null,            IgnoreGlobalQueryFilters: false,
DataReader: null,                             EnableStreaming: false,
UseOptionLoopJoin:false,                      ApplySubqueryLimit: 0
----------------------------------------------------------------------------------------------
METHOD: SetSynchronizeFilter<T>
        SetSynchronizeSoftDelete<T>

If we want to change the defaults, BulkConfig should be added explicitly, with one or more bool properties set to true and/or int properties like BatchSize set to a different number. The config also has a delegate Func for setting the underlying connection/transaction, e.g. in UnderlyingTest.
When doing an update we can choose to exclude one or more properties by adding their names to PropertiesToExclude; or, if we need to update fewer than half of the columns, PropertiesToInclude can be used instead. Setting both lists is not allowed.

When using the BulkInsertOrUpdate methods, you may also specify the PropertiesToIncludeOnCompare and PropertiesToExcludeOnCompare properties (only for SqlServer). Adding a column name to PropertiesToExcludeOnCompare allows it to be inserted and updated, but the row will not be updated if none of the other columns in that row changed. For example, if you are importing bulk data and want to remove an internal CreateDate or UpdateDate from the comparison, you add those columns to PropertiesToExcludeOnCompare.
Another option that may be used in the same scenario is the PropertiesToIncludeOnUpdate and PropertiesToExcludeOnUpdate properties. These allow you to specify insert-only columns such as CreateDate and CreatedBy.

If we want to insert only new rows and skip existing ones in the Db (insert-if-not-exists), then use BulkInsertOrUpdate with the config PropertiesToIncludeOnUpdate = new List<string> { "" }, as shown below.
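
Put together, using the config value quoted above:

var bulkConfig = new BulkConfig
{
    PropertiesToIncludeOnUpdate = new List<string> { "" } // empty entry: update no columns
};
context.BulkInsertOrUpdate(entities, bulkConfig); // inserts new rows, leaves existing ones untouched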

Additionally there is UpdateByProperties, for specifying the custom properties by which we want the update matching to be done.
When setting multiple properties in UpdateByProperties, the match is done on the columns combined, like a unique constraint based on those columns.
Using UpdateByProperties while also having an Identity column requires that the Id property be excluded (see the sketch below).
Also, with PostgreSQL, matching requires a unique index, so for custom UpdateByProperties that do not have one, it is created temporarily; in that case the method cannot run inside a transaction (it throws: current transaction is aborted; CREATE INDEX CONCURRENTLY cannot run inside a transaction block).
Something similar is done with MySQL, by temporarily adding a UNIQUE CONSTRAINT.
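
A minimal sketch of upserting by a custom natural key instead of the identity PK, assuming Item has an identity ItemId and a unique Name column:

var bulkConfig = new BulkConfig
{
    UpdateByProperties = new List<string> { nameof(Item.Name) },   // match rows by Name
    PropertiesToExclude = new List<string> { nameof(Item.ItemId) } // exclude the identity column
};
context.BulkInsertOrUpdate(entities, bulkConfig);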

If NotifyAfter is not set, it takes the same value as BatchSize; BulkCopyTimeout, when not set, uses the SqlBulkCopy default of 30 seconds, and if set to 0 it indicates no limit.
SetOutputIdentity has a purpose only when the PK has Identity (usually an int type with AutoIncrement); if the PK is a (sequential) Guid created in the application, there is no need for it.
Tables with composite keys have no Identity column, so there is no such functionality for them in that case either.

var bulkConfig = new BulkConfig { SetOutputIdentity = true, BatchSize = 4000 };
context.BulkInsert(entities, bulkConfig);
context.BulkInsertOrUpdate(entities, new BulkConfig { SetOutputIdentity = true }); //e.g.
context.BulkInsertOrUpdate(entities, b => b.SetOutputIdentity = true); //BulkConfig with Action arg.

PreserveInsertOrder is true by default and makes sure that entities are inserted into the Db in the same order as in the entities list.
When the table has an Identity column (int autoincrement), entities with 0 values in the list will temporarily and automatically have them changed from 0 into the range -N to -1.
Alternatively, they can be set manually with values in the proper order (negative values are used to avoid conflicts with existing ones in the Db).
A single Id value itself doesn't matter here, since the Db will change it to the next one in the sequence; what matters is their mutual relationship, for sorting.
Insertion order is implemented with TOP in conjunction with ORDER BY (so/merge-into-insertion-order).
This config should remain true when SetOutputIdentity is set to true on an entity containing a NotMapped property (issues/76).
When using SetOutputIdentity, Id values will be updated to the new ones from the database.
With BulkInsertOrUpdate on SQLServer, rows to be updated have to match on the Id column, or on other unique column(s) if using UpdateByProperties, in which case ordering is done by those properties instead of the Id, due to how SQL MERGE works. To preserve insert order by Id in this case, an alternative is to first use BulkRead to find which records already exist, then split the list into two lists, entitiesForUpdate and entitiesForInsert, without configuring UpdateByProperties.
Also, for SQLite the combination of BulkInsertOrUpdate and automatic Identity Id setting will not work properly, since SQLite does not have the full MERGE capabilities of SqlServer. Instead, the list can be split into two lists and BulkInsert and BulkUpdate called separately, as sketched below.
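
A minimal sketch of that SQLite split, assuming the usual convention where ItemId == 0 marks a not-yet-inserted row:

var forInsert = entities.Where(e => e.ItemId == 0).ToList();
var forUpdate = entities.Where(e => e.ItemId != 0).ToList();
context.BulkInsert(forInsert);
context.BulkUpdate(forUpdate);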

SetOutputIdentity is useful when BulkInsert is done into multiple related tables that have an Identity column.
After the insert into the first table is done, we need the Ids (if using Option 1) that were generated in the Db, because they are the FK (ForeignKey) in the second table.
It is implemented with OUTPUT as part of the MERGE query, so in this case even the insert is not done directly into the target table, but into a temp table that is then merged with the target table.
When used, Ids will be updated in the entities list; and if PreserveInsertOrder is set to false, the entities list will be cleared and reloaded.
SetOutputNonIdentityColumns is used only when SetOutputIdentity is set to true; if it remains true (the default), all columns are reloaded from the Db.
When changed to false, only the Identity column is loaded, to reduce the load back from the Db for efficiency.

Example of SetOutputIdentity with parent-child FK related tables:

int numberOfEntities = 1000;
var entities = new List<Item>();
var subEntities = new List<ItemHistory>();
for (int i = 1; i <= numberOfEntities; i++)
{
    var entity = new Item { Name = $"Name {i}" };
    entity.ItemHistories = new List<ItemHistory>()
    {
        new ItemHistory { Remark = $"Info {i}.1" },
        new ItemHistory { Remark = $"Info {i}.2" }
    };
    entities.Add(entity);
}

// Option 1 (recommended)
using (var transaction = context.Database.BeginTransaction())
{
    context.BulkInsert(entities, new BulkConfig { SetOutputIdentity = true });
    foreach (var entity in entities) {
        foreach (var subEntity in entity.ItemHistories) {
            subEntity.ItemId = entity.ItemId; // sets FK to match linked PK that was generated in DB
        }
        subEntities.AddRange(entity.ItemHistories);
    }
    context.BulkInsert(subEntities);
    transaction.Commit();
}

// Option 2 using Graph (only for SQL Server, and only for a simple parent-child relationship)
// - all entities in a relationship with the main ones in the list are BulkInsertOrUpdated
context.BulkInsert(entities, b => b.IncludeGraph = true);
  
// Option 3 with BulkSaveChanges() - uses the ChangeTracker, so a little slower than direct Bulk
context.Items.AddRange(entities);
context.BulkSaveChanges();

When CalculateStats is set to true, the result is returned in BulkConfig.StatsInfo (StatsNumber-Inserted/Updated/Deleted); a usage sketch follows after these option descriptions.
If used for a pure Insert (with batching), then SetOutputIdentity should also be configured, because a Merge is required.
TrackingEntities can be set to true if we want tracking of entities from BulkRead, or if SetOutputIdentity is set.
WithHoldlock means the Serializable isolation level that locks the table (it can have a negative effect on concurrency).
Setting it to false can optionally be used to solve a deadlock issue on Insert.
UseTempDB, when set, requires the BulkOperation to be inside a transaction.
UniqueTableNameTempDb, when changed to false, makes the temp table name just 'Temp', without random numbers.
CustomDestinationTableName can be set with 'TableName' only, or with 'Schema.TableName'.
CustomSourceTableName, when set, takes the source data from the specified table already in the Db, so the input list is not used and can be empty.
CustomSourceDestinationMappingColumns is a dict that can be set only if CustomSourceTableName is configured; it is used for specifying source-destination column names when they are not the same. There is an example in the test DestinationAndSourceTableNameTest.
EnableShadowProperties adds (normal) shadow properties and persists their values. It disables the automatic discriminator; use the manual method.
IncludeGraph, when set, also merges all entities that have relations with the main ones from the list into their own tables.
OmitClauseExistsExcept removes the EXISTS ... EXCEPT clause from the MERGE statement; this is required when having non-comparable types like XML, and useful when triggers need to fire even for identical data.
Also, in some SQL collations small and capital letters are considered the same (case-insensitive), so for BulkUpdate set it to false.
DoNotUpdateIfTimeStampChanged, if set, checks the TimeStamp for concurrency; rows with a conflict will not be updated.
The return info will be in the BulkConfig.TimeStampInfo object, in the field NumberOfSkippedForUpdate and the list EntitiesOutput.
SRID is the Spatial Reference Identifier - for SQL Server with NetTopologySuite.
DateTime2PrecisionForceRound: if db type datetime2 has a precision lower than the default 7, for example 'datetime2(3)', SqlBulkCopy does Floor instead of Round; when this property is set, rounding is done in memory to make sure inserted values are the same as with regular SaveChanges.
TemporalColumns are shadow columns used for Temporal tables. The default elements 'PeriodStart' and 'PeriodEnd' can be changed if those columns have custom names.
OnSaveChangesSetFK is used only for BulkSaveChanges. When multiple entries have an FK relationship that is Db-generated, this sets the proper value after reading the parent PK from the Db. If PKs are generated in memory, like some Guids, then this can be set to false for better efficiency.
ReplaceReadEntities, when set to true, provides the result of the BulkRead operation using replace instead of update: the entities-list parameter of the BulkRead method is repopulated with the obtained data. It enables the Contains/IN functionality, which returns all entities matching the criteria (the match does not have to be on unique columns).
UseOptionLoopJoin, when set, appends 'OPTION (LOOP JOIN)' for SqlServer, to reduce potential deadlocks on tables that have FKs. Use this SQL hint as a last resort, for experienced devs and db admins.
ApplySubqueryLimit: the default is zero '0'. When set to a larger value, it appends LIMIT 'N' to the generated query. Used only with PostgreSql.
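
The CalculateStats sketch promised above; the config object is read back after the call:

var bulkConfig = new BulkConfig { CalculateStats = true, SetOutputIdentity = true };
context.BulkInsertOrUpdate(entities, bulkConfig);
var stats = bulkConfig.StatsInfo; // per the naming above: StatsNumber-Inserted/Updated/Deleted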

DataReader can be set; when it is, it is propagated to the SqlBulkCopy util object.
EnableStreaming can be set to true to stream data to SqlBulkCopy; useful for big fields like blob or binary columns.

SqlBulkCopyOptions is an enum (only for SqlServer) with the [Flags] attribute, which enables specifying one or more options:
Default, KeepIdentity, CheckConstraints, TableLock, KeepNulls, FireTriggers, UseInternalTransaction
If you need to keep Identity PK values set in memory, not letting the Db do the auto-increment, then use KeepIdentity:
var bulkConfig = new BulkConfig { SqlBulkCopyOptions = SqlBulkCopyOptions.KeepIdentity };
This is useful, for example, when copying from one Db to another.

OnConflictUpdateWhereSql defines conditional updates on merges; it receives (existingTable, insertedTable).
--Example: bc.OnConflictUpdateWhereSql = (ex, in) => $"{in}.TimeUpdated > {ex}.TimeUpdated";
SetSynchronizeFilter<T> is a method that receives and sets an expression filter on entities to delete when using BulkInsertOrUpdateOrDelete. Those that are filtered out will be ignored and not deleted.
SetSynchronizeSoftDelete<T> is a method that receives and sets an expression on entities to update a property instead of deleting, when using BulkInsertOrUpdateOrDelete.
bulkConfig.SetSynchronizeSoftDelete<SomeObject>(a => new SomeObject { IsDeleted = true });

The last optional argument is an Action progress (example in EfOperationTest.cs RunInsert() with WriteProgress()):

context.BulkInsert(entitiesList, null, (a) => WriteProgress(a));
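
A hypothetical WriteProgress body for illustration; the decimal argument is assumed to be a completed fraction between 0 and 1:

static void WriteProgress(decimal completed) =>
    Console.WriteLine($"Progress: {completed:P0}");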

For parallelism, the important notes are:
-SqlBulk in Parallel
-Concurrent operations must not run on the same context instance
-Import data to a single unindexed table with a table-level lock

The library supports Global Query Filters and Value Conversions as well.
Additionally, BatchUpdate with a named property works with EnumToString conversion.
It can map OwnedTypes; there are also links with info on how to achieve NestedOwnedTypes and OwnedInSeparateTable.
On PostgreSQL, when an enum is in an OwnedType, it needs a converter explicitly configured in OnModelCreating.

Table splitting is somewhat specific, but can be configured; see Set TableSplit.
Computed and Timestamp columns work in the sense that they are automatically excluded from the insert; when combined with SetOutputIdentity, they will be selected back.
Spatial types, like Geometry, are also supported; if an entity has one, the EXISTS ... EXCEPT clause is skipped, because it is not comparable.
Performance of bulk ops can be measured with the ActivitySource named 'BulkExecute' (tags: 'operationType', 'entitiesCount').
Bulk extension methods can be overridden if required, for example to set AuditInfo.
If you have problems with deadlocks, there is useful info in issue/46.

The TPH (Table-Per-Hierarchy) inheritance model can be set up in 2 ways.
The first is automatically by convention, in which case the Discriminator column is not directly in the entity but is a shadow property.
The second is to explicitly define a Discriminator property in the entity and configure it with .HasDiscriminator().
An important remark regarding the first case: since we cannot set the Discriminator directly to a certain value, we first need to add the list of entities to the DbSet, where it will be set, and only after that call the bulk operation. Note that SaveChanges is not called, and we could optionally turn off change tracking for performance. Example:

public class Student : Person { ... }
context.Students.AddRange(entities); //adding to Context so Shadow property 'Discriminator' gets set
context.BulkInsert(entities);

TPT (Table-Per-Type) is supported as well.

efcore.bulkextensions's People

Contributors

0xced, bennycoomans, bieyuan, blackflys, borisdj, christophervr, daujyungx, davidstjacques, dependabot[bot], filmstarr, fretje, gandhis1, herrkater, hoffs, jr01, knalinne, konzen, leehom0123, light-traveller, maitlandmarshall, numpsy, pariesz, pawelgerr, rwasef1830, ryanthomas73, sabrite, saeidbabaei-dev, sky-daniel-paula, tomislav-jelcic, woodworm83


efcore.bulkextensions's Issues

It would be nice to specify columns

One feature that would be useful is to specify the exact columns. Any column not listed would be ignored by BulkExtensions, which would allow the database to set the default values instead of EF Core. For example, I am having an issue while upserting a SQL Server timestamp column, where the default value is generated in the temp table and then fails when merging into the actual table, because timestamps can't be explicitly inserted.

I would like to allow BulkConfig to override the list of columns to update.

Bulk Insert Case sensitivity problem with column names

Bulk insert treats column names as case-sensitive somewhere;
therefore, if there is even a single case difference in a column name in modelBuilder, the bulk insert fails.
For example, userID vs userId fails immediately:

entity.Property(e => e.userId)
                    .HasColumnName("userID")
                    .HasMaxLength(100);

The database column name is userId.
The C# code refers to it as userID.

Double update when bulkupdate

Hello, I am having trouble when using bulk update inside a transaction.

What happens:

  1. Inserting new entities
   await dbContext.BulkInsertAsync(newClasses, new BulkConfig()
            {
                SetOutputIdentity = true,
                WithHoldlock = false,
                SqlBulkCopyOptions = SqlBulkCopyOptions.UseInternalTransaction
            });
  2. Each entry is in state Unmodified. Modify each inserted entity:
foreach (var newClass in newClasses)
{
    newClass.TableName = "newString";
}

  3. Each entry is in state Modified. Making the bulk update:
 await dbContext.BulkUpdateAsync(newClasses, new BulkConfig
            {
                SetOutputIdentity = true,
                WithHoldlock = false
            });
  4. The update passed and the db is updated, but each entry of dbContext is still in the Modified state.
  5. Calling SaveChanges causes the EF tracker to update all entities again.

And one extra question: why can bulk update block the whole DbSet inside a ReadCommitted transaction?

Computed column error when SetOutputIdentity is set to true during BulkInsert

Can you please confirm the following issue?

I noticed BulkInsert fails if you have a computed column in the table and you are using SetOutputIdentity = true.

Code for loading the data:

public IEnumerable<TestResult> SeedTestResults(IEnumerable<TestResult> data)
{
	///This code was auto generated, if you make any change in this method then next time when the code is regenerated your changes will be overwritten 
	if (data.Any() == false)
		return data;

	using (var transaction = Database.BeginTransaction())
	{
		var items = data.ToList();
                //No effect of PropertiesToExclude for Insert
		//List<string> excludedProperties = null;
		//excludedProperties = new List<string>();
		//excludedProperties.Add("TestStatus");

		this.BulkInsert(items, new BulkConfig { SetOutputIdentity = true, PreserveInsertOrder = true /*, PropertiesToExclude = excludedProperties*/ });
		var relatedItems = new List<TestMeasurement>();
		foreach (var item in items)
		{
			foreach (var subItem in item.TestMeasurements)
			{
				subItem.TestResultId = item.TestResultId;
			}
			relatedItems.AddRange(item.TestMeasurements);
		}
		this.BulkInsert(relatedItems, new BulkConfig { SetOutputIdentity = true, PreserveInsertOrder = true });
		transaction.Commit();
		return data;
	}	
}

Following is the merge query that gets generated:

MERGE dbo.[TestResults] WITH (HOLDLOCK) AS T USING (SELECT TOP 60 * FROM dbo.[TestResultsTemp4b97610a] ORDER BY [TestResultId]) AS S ON T.[TestResultId] = S.[TestResultId] WHEN NOT MATCHED THEN INSERT ([BuildCode], [CustomerSequenceNumber], [ErrorMessage], [JobId], [RecipeId], [ShiftId], [StationId], [TestDate], [TestStatus], [TotalTimeInSeconds]) VALUES (S.[BuildCode], S.[CustomerSequenceNumber], S.[ErrorMessage], S.[JobId], S.[RecipeId], S.[ShiftId], S.[StationId], S.[TestDate], S.[TestStatus], S.[TotalTimeInSeconds]) OUTPUT INSERTED.[TestResultId], INSERTED.[BuildCode], INSERTED.[CustomerSequenceNumber], INSERTED.[ErrorMessage], INSERTED.[JobId], INSERTED.[RecipeId], INSERTED.[ShiftId], INSERTED.[StationId], INSERTED.[TestDate], INSERTED.[TestStatus], INSERTED.[TotalTimeInSeconds] INTO dbo.[TestResultsTemp4b97610aOutput];

Table script

CREATE TABLE [dbo].[TestResults](
	[TestResultId] [int] IDENTITY(1,1) NOT NULL,
	[JobId] [nvarchar](50) NULL,
	[BuildCode] [nvarchar](50) NULL,
	[CustomerSequenceNumber] [nvarchar](50) NULL,
	[TestStatus]  AS (case when ltrim(rtrim(isnull([ERRORMESSAGE],'')))='' then 'Pass' else 'Fail' end) PERSISTED NOT NULL,
	[ErrorMessage] [nvarchar](512) NULL,
	[TestDate] [datetime] NOT NULL,
	[TotalTimeInSeconds] [decimal](18, 2) NOT NULL,
	[StationId] [int] NOT NULL,
	[ShiftId] [int] NOT NULL,
	[RecipeId] [int] NULL,
 CONSTRAINT [PK_TestResults] PRIMARY KEY CLUSTERED 
(
	[TestResultId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
...
...
...
--removed indexes and other details for brevity

OnModelCreating code for TestResult table

modelBuilder.Entity<TestResult>(entity =>
{
	entity.Property(e => e.TestStatus)
		  .HasConversion<string>();

	entity.Property(e => e.ErrorMessage).HasMaxLength(512);

	entity.Property(e => e.JobId).HasMaxLength(50);

	entity.Property(e => e.TestDate).HasColumnType("datetime");

	entity.Property(e => e.TestStatus)
		.IsRequired()
		.HasMaxLength(4)
		.IsUnicode(false)
		.HasComputedColumnSql("(case when ltrim(rtrim(isnull([ERRORMESSAGE],'')))='' then 'Pass' else 'Fail' end)");

	entity.HasOne(d => d.Recipe)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.RecipeId)
		.HasConstraintName("FK_TestResults_Recipes");

	entity.HasOne(d => d.Shift)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.ShiftId)
		.OnDelete(DeleteBehavior.ClientSetNull)
		.HasConstraintName("FK_TestResults_PlantShifts");

	entity.HasOne(d => d.Station)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.StationId)
		.OnDelete(DeleteBehavior.ClientSetNull)
		.HasConstraintName("FK_TestResults_Stations");
});

Support for Owned Types (EF Core)

Class configuration

    public class CentraStageAlert : Entity<string>
    {
        public CentraStageAlert()
        {
            AlertMonitorInfo = new CentraStageAlertMonitorInfo();
            AlertSourceInfo = new CentraStageAlertSourceInfo();
        }

        public CentraStageAlertMonitorInfo AlertMonitorInfo { get; set; } // owned type

       // rest of props
    }
    public class CentraStageAlertMonitorInfo
    {
        public bool SendsEmails { get; set; }
        public bool CreatesTicket { get; set; }
    }

Entity Configuration

    public class CentraStageAlertConfiguration : IEntityTypeConfiguration<CentraStageAlert>
    {
        public void Configure(EntityTypeBuilder<CentraStageAlert> builder)
        {
            builder
                .Property(p => p.Id)
                .ValueGeneratedNever();

            builder
                .OwnsOne(p => p.AlertMonitorInfo);
        }
    }

The following exception is thrown when using the InsertOrUpdate method.

SqlException: Cannot insert the value NULL into column 'AlertMonitorInfo_CreatesTicket', table 'PortalCore.dbo.CentraStageAlerts'; column does not allow nulls. UPDATE fails.
The statement has been terminated.

After a bit of investigating myself, I think it's this line which is part of the problem:

var allProperties = entityType.GetProperties().AsEnumerable();

As the GetProperties() method does not return the Navigation properties that are present. I think a combination of GetNavigations() and IsOwned() would be needed to get all the required properties.

Is adding support for Owned Types something you're interested in? If so, I can have a crack at a PR; I'm just not overly familiar with the SqlBulkCopy side of things.

Merge Matching on Null

I'm trying to BulkInsertOrUpdate with a defined list of UpdateByProperties. The only way I'm able to get it to update instead of insert is when the UpdateByProperties list contains all non-null values for a row. So for example, a row of data consists of Field1, Field2, Value, with Field1 and Field2 as UpdateByProperties. If a row of data that currently exists has Field1 = 'a' and Field2 = 'b' and the new data matches this, it will correctly update Value. But if Field2 is null in both the existing row and the new row, a second row gets inserted.

Is there a property I should be using to get around this or is there a way that I can get nulls to be matched?

Support for custom ValueConverters (EF Core 2.1)

EF Core 2.1 introduces the concept of converters that allow you to convert to/from different data types in the DB. Do you have plans to incorporate these into the bulk extensions? See here for more information.

tableInfo.BulkConfig.UseTempDB = false;

Would it be possible for you to remove the following line of code? The code works fine without it if used within an explicit transaction. The line causes an issue in SQL Server, since it requires the account to have a high privilege like db_owner or similar, because normal tables are used as temp tables in the merge operation and have to be created and dropped.

if (operationType != OperationType.Insert)
            {
                tableInfo.BulkConfig.UseTempDB = false; // TempDB can only be used with Insert.
                // Other Operations done with customTemp table.
                // If using tempdb[#] throws exception: 'Cannot access destination table' (gets Droped too early, probably because transaction ends)
            }

Columns with spaces cause BulkInsertOrUpdate to fail

How to reproduce: Create a table that has a column name with spaces in it. Then, try to run a query.

This can actually be easily fixed by detecting whitespace in the Column names and wrapping them with brackets [] when whitespace is detected.

NOTE: This does not occur on a mere BulkInsert, but a BulkInsertOrUpdate call.

Parent child relationship

I have a quick question: does the parent-child relationship work while saving the data?

For example, I have following classes:

public partial class Application
{
    public Application()
    {
        Stations = new HashSet<Station>();
    }

    public int ApplicationId { get; set; }
    public string ApplicationName { get; set; }

    public ICollection<Station> Stations { get; set; }
}

public partial class Station
{
    public Station()
    {
        TestResults = new HashSet<TestResult>();
    }

    public int StationId { get; set; }
    public string StationName { get; set; }
    public int ApplicationId { get; set; }

    public Application Application { get; set; }
    //other properties removed for brevity
}

Below code is inside OnModelCreating

modelBuilder.Entity<Application>(entity =>
{
    entity.Property(e => e.ApplicationName)
        .IsRequired()
        .HasMaxLength(50);
});

modelBuilder.Entity<Station>(entity =>
{
    entity.HasIndex(e => e.StationName)
        .HasName("IX_Stations")
        .IsUnique();

    entity.Property(e => e.StationName)
        .IsRequired()
        .HasMaxLength(50);

    entity.HasOne(d => d.Application)
        .WithMany(p => p.Stations)
        .HasForeignKey(d => d.ApplicationId)
        .OnDelete(DeleteBehavior.ClientSetNull)
        .HasConstraintName("FK_Stations_Applications");
});

Following is my input for seeding the Applications and Stations table.

void SeedApplications()
{
	using (var repository = RepositoryFactory.GetRepository())
	{
		var applications = repository.SeedApplications(new Application[] {
			new Application
			{   ApplicationName = "Front EOL", ApplicationId = 0, Stations = new Station[]
				{
					new Station{ StationName = "1600N" ,StationId = 0, ApplicationId = 0 },
					new Station{ StationName = "1600S" ,StationId = 0, ApplicationId = 0 },
					new Station{ StationName = "1700S" ,StationId = 0, ApplicationId = 0 },
				}
			},
			new Application
			{   ApplicationName = "FAM", ApplicationId = 0, Stations = (new Station[]
				{
					new Station{ StationName = "1650N",StationId = 0, ApplicationId = 0  },
					new Station{ StationName = "1700N",StationId = 0, ApplicationId = 0  },
					new Station{ StationName = "1750N",StationId = 0, ApplicationId = 0  }
				}
				)
			}			
		});
	}
}

Following is a class to create a Repository instance:

static class RepositoryFactory
{
	public static Repository GetRepository()
	{
		DbContextOptionsBuilder b = new DbContextOptionsBuilder();
		b.UseSqlServer(System.Configuration.ConfigurationManager.ConnectionStrings["TestResultServer"].ConnectionString);
		return new Repository(b.Options);
	}
}

Actual repository class:

public partial class Repository : DbContext
{
        public virtual DbSet<Application> Applications { get; set; }
        public virtual DbSet<Station> Stations { get; set; }

	public IEnumerable<Application> SeedApplications(IEnumerable<Application> data)
	{
		///This code was auto generated, if you make any change in this method then next time when the code is regenerated your changes will be overwritten 
		if(data.Any() == false)
			return data;
		var items = data.ToList();
		using (var transaction = Database.BeginTransaction())
		{
			Applications.AddRange(items);
			this.BulkInsert(items, new BulkConfig {  SetOutputIdentity = true });
			transaction.Commit();
			return items;
		}		
	}
}

I was hoping that after inserting the Applications records, the Stations records would be inserted too, but that never happens. Following is my debug window output, which shows that after the insert the Stations property is cleared.

Before bulk insert: (screenshot)

After bulk insert: (screenshot)

If I use the regular Insert from EF Core, it works just fine (the code is commented out in the above screenshot). The problem only occurs if I use the BulkInsert method.

I am wondering if you can please clear up my doubt and let me know if this is possible or not :)

Thanks.

Getting Deadlock Issues while doing Bulk insert

We are using the bulk insert package, and in regular tests the inserts go through fine, but in load tests where multiple web jobs are trying to do bulk inserts, we are getting deadlocks. Any pointers or settings you can suggest to solve the issue?

Cannot drop the table

Hello,

I encountered an issue using EFCore.BulkExtensions.
The db user is set as db_owner on the related database.

(screenshot of the error)
Does anyone have any clue about that?

(screenshot)

Code I use for that: (screenshot)

If you need any information, do not hesitate.

PS: It's not about rights; I can delete a table with Navicat / SQL Management Studio / MSSQL Tools using the same credentials.

Btw, thanks for your awesome work, it's saving my life <3

Support for SQLLITE?

Hello,

Is there any support for SQLite? If not, is there a plan to add it in the future, and when?

Kind regards

FireTriggers Option

I needed the ability to fire triggers on my tables when doing bulk inserts. I don't actually know how to do a pull request, but below is the code (separated by file) that I modified to allow me to do that, if anyone else needs the ability too. It would be nice if this were in the package. This also enables all the SqlBulkCopyOptions. Essentially I added a BulkCopyOptions into BulkConfig, and anywhere BulkConfig was being used I modified it to accept BulkCopyOptions as a parameter. Note, I did not remove the KeepIdentity property, which is a SqlBulkCopyOptions option, but there are now 0 references to it. I also left the original code (commented out) in GetSqlBulkCopy(), simply because I only tested it for my scenario and wasn't sure if it followed the same logic path that is expected for other scenarios.

BulkConfig.cs

using System.Collections.Generic;
using System.Data.SqlClient;

namespace EFCore.BulkExtensions
{
    public class BulkConfig
    {
        public bool PreserveInsertOrder { get; set; }

        public bool SetOutputIdentity { get; set; }

        public int BatchSize { get; set; } = 2000;

        public int? NotifyAfter { get; set; }

        public int? BulkCopyTimeout { get; set; }

        public bool EnableStreaming { get; set; }

        public bool UseTempDB { get; set; }

        public bool KeepIdentity { get; set; }

        public List<string> PropertiesToInclude { get; set; }

        public List<string> PropertiesToExclude { get; set; }

        public List<string> UpdateByProperties { get; set; }

        public SqlBulkCopyOptions BulkCopyOptions { get; set; }
    }
}

SqlBulkOperation.cs

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<decimal> progress)
        {
            var sqlConnection = OpenAndGetSqlConnection(context);
            var transaction = context.Database.CurrentTransaction;
            try
            {
                using (var sqlBulkCopy = GetSqlBulkCopy(sqlConnection, transaction, tableInfo.BulkConfig.BulkCopyOptions))
                {
                    tableInfo.SetSqlBulkCopyConfig(sqlBulkCopy, entities, progress);
                    using (var reader = ObjectReaderEx.Create(entities, tableInfo.ShadowProperties, context, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                    {
                        sqlBulkCopy.WriteToServer(reader);
                    }
                }
            }
            finally
            {
                if (transaction == null)
                {
                    sqlConnection.Close();
                }
            }
        }

        public static async Task InsertAsync<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<decimal> progress)
        {
            var sqlConnection = await OpenAndGetSqlConnectionAsync(context);
            var transaction = context.Database.CurrentTransaction;
            try
            {
                using (var sqlBulkCopy = GetSqlBulkCopy(sqlConnection, transaction, tableInfo.BulkConfig.BulkCopyOptions))
                {
                    tableInfo.SetSqlBulkCopyConfig(sqlBulkCopy, entities, progress);
                    using (var reader = ObjectReaderEx.Create(entities, tableInfo.ShadowProperties, context, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                    {
                        await sqlBulkCopy.WriteToServerAsync(reader).ConfigureAwait(false);
                    }
                }
            }
            finally
            {
                if (transaction == null)
                {
                    sqlConnection.Close();
                }
            }
        }

private static SqlBulkCopy GetSqlBulkCopy(SqlConnection sqlConnection, IDbContextTransaction transaction, SqlBulkCopyOptions options = SqlBulkCopyOptions.Default)
        {
            if (transaction == null)
            {
                return new SqlBulkCopy(sqlConnection, options, null);
            }
            else
            {
                var sqlTransaction = (SqlTransaction)transaction.GetDbTransaction();
                return new SqlBulkCopy(sqlConnection, options, sqlTransaction);
            }
            //if (transaction == null)
            //{
            //    if (keepIdentity)
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity, null);
            //    else
            //        return new SqlBulkCopy(sqlConnection);
            //}
            //else
            //{
            //    var sqlTransaction = (SqlTransaction)transaction.GetDbTransaction();
            //    if (keepIdentity)
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity, sqlTransaction);
            //    else
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.Default, sqlTransaction);
            //}
        }

System.ArgumentOutOfRangeException when BulkInsertOrUpdate

Hi,

I got System.ArgumentOutOfRangeException: 'StartIndex cannot be less than zero.'
when running the code below.

   _db.BulkInsertOrUpdate(hotelSegments, new BulkConfig { BatchSize = hotelSegments.Count });

Stack Trace:

  at System.String.Remove(Int32 startIndex, Int32 count)
  at EFCore.BulkExtensions.SqlQueryBuilder.GetCommaSeparatedColumns(List`1 columnsNames, String prefixTable, String equalsTable)
  at EFCore.BulkExtensions.SqlQueryBuilder.MergeTable(TableInfo tableInfo, OperationType operationType)
  at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType, Action`1 progress)
  at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertOrUpdate[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)

Cannot drop the table 'nameTable, because it does not exist or you do not have permission.

I have this problem when I try to save:

Cannot drop the table 'dbo.nameTableTempf21dbf8f', because it does not exist or you do not have permission.
I suppose the error is in my connection string?
On the other hand, my save is very slow: approximately seven minutes.

_context.Database.SetCommandTimeout(360);
//listTemp = listTemp.OrderBy(T => T.Id).ToList();
//_context.Database.SetCommandTimeout(0);

using (var transaction = _context.Database.BeginTransaction())
{
    //_context.BulkInsert(listTemp, new BulkConfig { SetOutputIdentity = true, BatchSize = 4000, NotifyAfter = 240 });
    _context.BulkInsert(listTemp, new BulkConfig { SetOutputIdentity = true, NotifyAfter = 360 });

    transaction.Commit();
}

Support for non public setter of PK and 'PreserveInsertOrder'

Hi,

I have an entity where the primary key has a public get and private set:

public class Entity 
{
    public int Id {get; private set;}
}

I'm not able to do a bulk insert on this with PreserveInsertOrder, because of this:

protected void UpdateEntitiesIdentity<T>(IList<T> entities, IList<T> entitiesWithOutputIdentity)
{
    if (this.BulkConfig.PreserveInsertOrder) // Updates PK in entityList
    {
        var accessor = TypeAccessor.Create(typeof(T));
        for (int i = 0; i < this.NumberOfEntities; i++)
            accessor[entities[i], this.PrimaryKeys[0]] = accessor[entitiesWithOutputIdentity[i], this.PrimaryKeys[0]];
    }
}

It creates the accessor that handles only fully public properties.

I think that the fix should be as simple as changing it to the following:

TypeAccessor.Create(typeof(T), true);

Bulk Insert Should Update ID's

I am doing a bulk insert but would then like the entities I am inserting to have their primary keys updated. Right now, they stay as zero. I would like to do this so I can bulk insert related entities, where I need to set the foreign key.

How can I achieve this?

Nuget 2.0.9 'reference not set to an instance' Question, and DbSchema keyword support [SquareBrackets]

I am trying to bulk insert some data according to the example given here, but it is throwing an exception that I can't trace.

Here is the code

using(var tr = await _context.Database.BeginTransactionAsync())
            {
                await _context.BulkInsertAsync(model);
                await _context.SaveChangesAsync();
                tr.Commit();
            }

It throws the exception on _context.BulkInsertAsync.

Below is the stack trace

System.NullReferenceException: Object reference not set to an instance of an object.

at EFCore.BulkExtensions.TableInfo.<LoadData>b__65_2[T](IProperty a)
 at System.Linq.Enumerable.Any[TSource](IEnumerable`1 source, Func`2 predicate)
 at EFCore.BulkExtensions.TableInfo.LoadData[T](DbContext context, Boolean loadOnlyPKColumn)
 at EFCore.BulkExtensions.TableInfo.CreateInstance[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig)
 at EFCore.BulkExtensions.DbContextBulkTransaction.ExecuteAsync[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig, Action`1 progress)
 at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertAsync[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)
 at AccountService.<model>d__17.MoveNext() in C:\Project\AccountService.cs:line 303
--- End of stack trace from previous location where exception was thrown ---

Thanks

Temp tables staying in database

Hi Boris. I noticed periodically in my code that the temp tables the bulk insert creates end up not being dropped. The thing that bothers me is that I now have a piece of code where it happens every time. What is the solution?

EntityFramework 6.x

I'm guessing EFCore.BulkExtensions is specific to EF Core. Any plans to release a version for old-school EF 6.x?

Type Conversion failure in CheckHasIdentity()

It would seem the cast that's happening right in the while loop causes an exception in cases where there are no identities. I'm not entirely sure if this is something weird propagating in the environment I'm working with, or if it is merely something else.

Steps to Reproduce

  1. Create a table that has a PK column which lacks any identity/auto-increment attributes so it sits as a normal column with the rest.
  2. Set BulkConfig to SetOutputIdentity to true. (i.e. new BulkConfig { SetOutputIdentity = true })
  3. Attempt a BulkInsert call. (haven't tested but this likely affects the other insertion methods as well)

Solution / Notes

In any case, the offending line is occurring in TableInfo.cs right here.

For fixing the problem, I was able to simply alter the statement to the following:
hasIdentity = reader[0] == DBNull.Value ? 0 : 1;

Though I haven't exactly spent time investigating this issue further so if it ends up being some kind of weird false positive due to my environment then I do apologize in advance for occupying your time. Please let me know if there's any further information you would need.

Thanks!

Opened connections are not closed properly

If you try ~200-300 insertions, you will fail with an exception: cannot open a new connection.

It can be fixed easily: wrap sqlBulkCopy in a using statement in the SqlBulkOperation.cs Insert method:

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<double> progress = null, int batchSize = 2000)
        {
            var sqlBulkCopy = new SqlBulkCopy(context.Database.GetDbConnection().ConnectionString)
            {
                DestinationTableName = tableInfo.InsertToTempTable ? tableInfo.FullTempTableName : tableInfo.FullTableName,
                BatchSize = batchSize,
                NotifyAfter = batchSize
            };

            using (sqlBulkCopy) // add using here
            {
                sqlBulkCopy.SqlRowsCopied += (sender, e) => { progress?.Invoke(e.RowsCopied / entities.Count); };

                foreach (var element in tableInfo.PropertyColumnNamesDict)
                {
                    sqlBulkCopy.ColumnMappings.Add(element.Key, element.Value);
                }
                using (var reader = ObjectReader.Create(entities, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                {
                    sqlBulkCopy.WriteToServer(reader);
                }
            }
        }

Fix here: commit

Connection is not opened - BulkMerge

In EFCore.BulkExtensions.TableInfo.CheckHasIdentity there is a conn.OpenAsync() call. I think you have to await here, or use the sync method.

System.InvalidOperationException: ExecuteReader requires an open and available Connection. The connection's current state is connecting.
   at System.Data.SqlClient.SqlConnection.GetOpenTdsConnection(String method)
   at System.Data.SqlClient.SqlConnection.ValidateConnectionForExecute(String method, SqlCommand command)
   at System.Data.SqlClient.SqlCommand.ValidateCommand(Boolean async, String method)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean asyncWrite, String method)
   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior)
   at EFCore.BulkExtensions.TableInfo.CheckHasIdentity(DbContext context) in D:\Projects\! GitHub\EFCore.BulkExtensions\EFCore.BulkExtensions\TableInfo.cs:line 61
   at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType) in D:\Projects\! GitHub\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs:line 49
   at Auto1.Prices.Repositories.ProductRepository.InsertOrUpdate(IEnumerable`1 entities) in D:\Projects\Auto1.Prices\Auto1.Prices\Repositories\ProductRepository.cs:line 55
   at Auto1.Prices.Controllers.ExchangeController.UpdateProducts(ExchangeProduct[] products) in D:\Projects\Auto1.Prices\Auto1.Prices\Controllers\ExcangeController.cs:line 49
   at lambda_method(Closure , Object , Object[] )
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeActionMethodAsync>d__27.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextActionFilterAsync>d__25.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Rethrow(ActionExecutedContext context)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextExceptionFilterAsync>d__24.MoveNext()
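
A minimal sketch of the suggested fix (names match the CheckHasIdentity method quoted in full further down):

var conn = context.Database.GetDbConnection();
conn.Open(); // synchronous open; previously conn.OpenAsync() was fired and never awaited

// or, if the method is made async:
await conn.OpenAsync();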

Given ColumnMapping is invalid

I am getting this error and was wondering: do I need to implement every column of the database table in my entity for this to work?

I have to build a series of bulk update services for a project, but I only want to pass in a list of the users' ids and the new value that all the users will be updated to.

Can I not create an entity that doesn't expose all the columns and still have BulkUpdate work?

BulkUpdate error with Connect/ShowPlan Db Permission

I have a table that contains a unique, non-clustered, filter index. There are no identity columns or triggers.

When I try to do a bulk update I receive a "CREATE TABLE permission denied in database" error.

The database access uses SQL authentication, and the user is granted Connect and Showplan permissions.

My current workaround is to grant Alter permission; however, according to the related bcp utility documentation:

A bcp in operation minimally requires SELECT/INSERT permissions on the target table. In addition, ALTER TABLE permission is required if any of the following is true:

  • Constraints exist and the CHECK_CONSTRAINTS hint is not specified.
  • Triggers exist and the FIRE_TRIGGER hint is not specified.
  • You use the -E option to import identity values from a data file.

I was expecting that specifying only the SqlBulkCopyOptions.CheckConstraints flag would make the Alter permission unnecessary; however, the error still occurs.

Additionally, I have successfully been able to do bulk inserts with the same user into a different table that has no unique index.

Temp Table Clean-Up Question

Hello,

I'm using BulkInsertOrUpdate and, unrelated to this library, I sometimes run into data scenarios that cause the operation to fail, at which point an exception is thrown in my code. Is there a setting I can enable to ensure that this library cleans up any temp tables it created if an exception occurs during the operation?

BulkDelete throws NullReferenceException in 2.0.9

After upgrading from 2.0.8 to 2.0.9, I started getting a NullReferenceException from some of my tests that use the BulkDelete functionality. Everything works fine when I revert the package to 2.0.8.

Exception:
System.NullReferenceException: 'Object reference not set to an instance of an object.'

Stack trace:

at EFCore.BulkExtensions.TableInfo.b__65_2[T](IProperty a)
at System.Linq.Enumerable.Any[TSource](IEnumerable`1 source, Func`2 predicate)
at EFCore.BulkExtensions.TableInfo.LoadData[T](DbContext context, Boolean loadOnlyPKColumn)
at EFCore.BulkExtensions.TableInfo.CreateInstance[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig)
at EFCore.BulkExtensions.DbContextBulkTransaction.Execute[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig, Action`1 progress)
at EFCore.BulkExtensions.DbContextBulkExtensions.BulkDelete[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)
at MyService.SqlService.BulkCopy.SqlBulkCopyService`1.BulkDelete[TEntity](Expression`1 deleteFilter)
at MyService.PerfTests.Search.SearchLiveDbFixture..ctor()

Here's my service wrapper which uses the BulkDelete functionality:

public void BulkDelete<TEntity>(
    Expression<Func<TEntity, bool>> deleteFilter)
    where TEntity : class
{
    var dbSet = _dbContext.Set<TEntity>();
    IQueryable<TEntity> entitiesToDelete;
    if (deleteFilter != null)
    {
        entitiesToDelete = dbSet.Where(deleteFilter).AsNoTracking();
    }
    else
    {
        entitiesToDelete = dbSet.AsNoTracking();
    }

    _dbContext.BulkDelete(entitiesToDelete.ToList());
}

Additional info:

  • _dbContext.Model doesn't seem to be null
  • entitiesToDelete is not null; the exception is thrown whether or not the list is empty

Any ideas?

Be careful when using BulkInsertOrUpdate with a Composite Key

When using the BulkInsertOrUpdate operation, you have to be careful if your primary key is a composite key. Say you pass in a list of records, some of which are new and need to be inserted, but part of the composite key is the same as an existing key: the operation will treat such a record as existing and overwrite it instead of inserting a new one. I got burned by that and spent hours banging my head.

Throws exception on BulkInsert when SetOutputIdentity = true and the Primary Key db column name differs from the model property name

I have a model defined like so

public class Entity
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    //some other properties
}

However, using the fluent API in the context, the Id property is actually stored in the database as EntityId (name of the class + "Id").

When using SetOutputIdentity = true, it throws an exception saying "Invalid column: 'Id'", so it appears to be using the property name from the model rather than the actual column name somewhere.
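
For reproduction, a fluent mapping along these lines (illustrative; the exact configuration may differ) produces the mismatch:

modelBuilder.Entity<Entity>()
    .Property(e => e.Id)
    .HasColumnName("EntityId"); // the stored column name differs from the property name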

Exception thrown when inserting, but data inserted into db

Hi,
I have two problems and was hoping someone could point me in the right direction. I am trying to insert and get the Id populated by using the 'SetOutputIdentity = true' flag with v2.0.7.

  1. I got an exception: 'There is already an open DataReader associated with this Command which must be closed first.' The data was still inserted into SQL Server despite the exception.

  2. I am having the same issue as the one below, where the Id property does not change to reflect the Id in the table.
    https://github.com/borisdj/EFCore.BulkExtensions/issues/18

Any suggestions?

Thanks,
ED

"The given ColumnMapping does not match up with any column in the source or destination" - When using GetDataTable<T>

First, I was receiving this exception:

NotSupportedException: DataSet does not support System.Nullable<>.
Module "System.Data.DataColumn", line 164, col 0, in .ctor
Void .ctor(System.String, System.Type, System.String, System.Data.MappingType)
Module "System.Data.DataColumnCollection", line 8, col 0, in Add
System.Data.DataColumn Add(System.String, System.Type)
File "C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs", line 175, col 21, in GetDataTable
System.Data.DataTable GetDataTable[T](Microsoft.EntityFrameworkCore.DbContext, System.Collections.Generic.IList`1[T])
File "C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs", line 75, col 25, in MoveNext
Void MoveNext()
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", line 12, col 0, in Throw
Void Throw()
Module "System.Runtime.CompilerServices.TaskAwaiter", line 46, col 0, in HandleNonSuccessAndDebuggerNotification
Void HandleNonSuccessAndDebuggerNotification(System.Threading.Tasks.Task)

This seems to be caused by the following line, when the column is of type DateTime?:

dataTable.Columns.Add(columnName, property.PropertyType);

I managed to solve this by changing the line to:

dataTable.Columns.Add(columnName, Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType);

But then it throws this exception

System.InvalidOperationException: The given ColumnMapping does not match up with any column in the source or destination.
   at System.Data.SqlClient.SqlBulkCopy.AnalyzeTargetAndCreateUpdateBulkCommand(BulkCopySimpleResultSet internalResults)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
--- End of stack trace from previous location where exception was thrown ---
   at EFCore.BulkExtensions.SqlBulkOperation.InsertAsync[T](DbContext context, IList`1 entities, TableInfo tableInfo, Action`1 progress) in C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs:line 78

On this line

await sqlBulkCopy.WriteToServerAsync(dataTable);

Any ideas what could be causing this?
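
One thing worth checking (a hedged guess, not a confirmed diagnosis): when the DataTable columns are created with the underlying non-nullable type, the row values must also be converted, with null becoming DBNull.Value, and every DataTable column name must exactly match an entry in sqlBulkCopy.ColumnMappings. A minimal sketch of nullable-safe DataTable construction, with illustrative names:

var dataTable = new DataTable();
var properties = typeof(T).GetProperties();
foreach (var property in properties)
{
    // unwrap Nullable<T> since DataSet does not support System.Nullable<>
    var columnType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType;
    dataTable.Columns.Add(property.Name, columnType); // name must match a ColumnMapping entry
}
foreach (var entity in entities)
{
    var row = dataTable.NewRow();
    foreach (var property in properties)
    {
        row[property.Name] = property.GetValue(entity) ?? DBNull.Value; // nulls must become DBNull
    }
    dataTable.Rows.Add(row);
}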

Batch Size

Thank you kindly for this library.

Previously I was using http://entityframework-extensions.net/?z=codeplex, which had the ability to specify a batchSize. I found this essential for my project, as we sometimes need to insert millions of records, which fails unless we can batch them up. Any chance of adding BatchSize to this library?
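
Until a built-in option exists, chunking the list client-side is a straightforward stopgap; a minimal sketch (using System.Linq):

const int batchSize = 10000;
for (int i = 0; i < entities.Count; i += batchSize)
{
    var batch = entities.Skip(i).Take(batchSize).ToList();
    context.BulkInsert(batch); // each call copies one manageable chunk
}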

Class structures with Discriminator aren't supported

Trying to do a BulkInsert with a structure that maps several types to one table using a Discriminator field results in an IndexOutOfRange exception in FastMember.
This is obviously due to the fact that the Discriminator column isn't present as a property on the instances.
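
For context, this is the standard EF Core TPH configuration that produces such a shadow Discriminator column (illustrative types):

modelBuilder.Entity<Animal>()
    .HasDiscriminator<string>("Discriminator")
    .HasValue<Cat>("Cat")
    .HasValue<Dog>("Dog"); // the Discriminator column exists only in the database, not on the CLR types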

Does it support the InMemory provider?

I want to move away from our current solution (a combination of temp tables, SqlBulkCopy, and MERGE statements) to increase testability. The majority of our tests already use the InMemory provider. Will this library support it?

Does the update selectively update only the affected columns or the whole record

I have been using BulkUpdate in my program extensively; it's a very useful tool.

Some time after we began actively using the program, users realized that saved data was being lost, namely overwritten.

A user can change an item price from $10 to $20, and after 15 minutes or so the price is back to its original value, like a phantom flipping switches inside the program.

After some soul searching, I started to think about what the problem might be.

We have a lot of background processes running every hour to get updated info from Amazon, namely buy box price, selling price, FBA fees, etc.

The process goes like this:
The program gets a list of items from the table and loads them into memory using regular EF Core; it modifies them and updates the table by passing the list of objects to the BulkUpdate function.

Now consider this scenario:
Process1, which modifies the buy-box column, starts at 3:00 p.m. and runs for 20 minutes. It gets a bunch of objects from the database and does its work.

Five minutes later, at 3:05 p.m., process2 starts; it gets a list of objects from the database and begins its work of modifying the selling-price column.

It finishes its work, and at 3:15 p.m. it updates the database. Let's say the selling price of "item1" was modified by process2 from $10 to $5.

But "item1" is also in the list that process1 took out of the database. Now, at 3:20 p.m., process1 sends its updates back to the database using BulkUpdate. It has many modified buy-box fields, but it overwrites each whole record with the old values it collected when reading the items at 3:00 p.m., including the stale information that the selling price of "item1" is $10, thus overwriting the new selling-price value that process2 had written to the database.

So my question is: does the update selectively update only the affected columns, or the whole record?

Please comment.

Naftaly
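
If the answer turns out to be the whole record (which the behavior described above suggests), restricting the written columns may help; a hedged sketch using the BulkConfig.PropertiesToInclude option available in later versions of the library, with hypothetical entity and property names:

var bulkConfig = new BulkConfig
{
    // write only these columns; other columns on the row are left untouched
    PropertiesToInclude = new List<string> { nameof(Item.Id), nameof(Item.BuyBoxPrice) }
};
context.BulkUpdate(items, bulkConfig);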

Lost password in connectionString

The BulkInsert method works fine, but BulkInsertOrUpdate throws an exception like the one below.

System.Data.SqlClient.SqlException occurred
  HResult=0x80131904
  Message=Login failed for user 'user'.
  Source=Core .Net SqlClient Data Provider
  StackTrace:
   at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\SqlBulkOperation.cs:line 81
   at EFCore.BulkExtensions.DbContextBulkTransaction.Execute[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\DbContextBulkTransaction.cs:line 22
   at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertOrUpdate[T](DbContext context, IList`1 entities, BulkConfig bulkConfig) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\DbContextBulkExtensions.cs:line 15

It seems the reason is that context.Database.GetDbConnection().ConnectionString returns the connection string without the password.

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<double> progress = null, int batchSize = 2000)
{
    var sqlBulkCopy = new SqlBulkCopy(context.Database.GetDbConnection().ConnectionString) // ConnectionString has already lost its password by this point
    {
        DestinationTableName = tableInfo.InsertToTempTable ? tableInfo.FullTempTableName : tableInfo.FullTableName,
        BatchSize = batchSize,
        NotifyAfter = batchSize
    };
    sqlBulkCopy.SqlRowsCopied += (sender, e) => { progress?.Invoke(e.RowsCopied / entities.Count); };

    foreach (var element in tableInfo.PropertyColumnNamesDict)
    {
        sqlBulkCopy.ColumnMappings.Add(element.Key, element.Value);
    }
    using (var reader = ObjectReader.Create(entities, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
    {
        sqlBulkCopy.WriteToServer(reader);
    }
}
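
A hedged sketch of one possible fix: hand SqlBulkCopy the context's own connection object instead of rebuilding one from ConnectionString (once a connection has been opened, SqlClient scrubs the password from ConnectionString unless Persist Security Info=true is set):

var connection = (SqlConnection)context.Database.GetDbConnection();
if (connection.State != ConnectionState.Open)
{
    connection.Open(); // reuse the already-authenticated connection
}
using (var sqlBulkCopy = new SqlBulkCopy(connection))
{
    // ... same DestinationTableName, BatchSize, mappings and WriteToServer as above
}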

The reason is that after the CheckHasIdentity method is invoked, the DbContext somehow loses its password.

public void CheckHasIdentity(DbContext context)
{
    int hasIdentity = 0;
    var conn = context.Database.GetDbConnection();
    try
    {
        conn.OpenAsync(); // not awaited; note also that opening the connection is when SqlClient scrubs the password from ConnectionString
        using (var command = conn.CreateCommand())
        {
            string query = SqlQueryBuilder.SelectIsIdentity(FullTableName, PrimaryKey);
            command.CommandText = query;
            DbDataReader reader = command.ExecuteReader();

            if (reader.HasRows)
            {
                while (reader.Read())
                {
                    hasIdentity = (int)reader[0]; // throws on DBNull when the table has no identity column (see the first report above)
                }
            }
            reader.Dispose();
        }
    }
    finally
    {
        conn.Close();
    }

    HasIdentity = hasIdentity == 1;
}

Thanks.

BulkInsert doesn't work inside transaction. Tries to force open a new connection.

using (var tran = Context.Database.BeginTransaction())
{
   Context.BulkInsert(assets);
   tran.Commit();
}

Calling BulkInsert like this results in the following exception:

{System.InvalidOperationException: The connection was not closed. The connection's current state is open.
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.Open()
   at EFCore.BulkExtensions.SqlBulkOperation.Insert[T](DbContext context, IList`1 entities, TableInfo tableInfo, Action`1 progress)
   at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsert[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)

If I try to close the connection first, which already seems like a hazardous action:

using (var tran = Context.Database.BeginTransaction())
{
   Context.Database.GetDbConnection().Close();
   Context.BulkInsert(assets);
   tran.Commit();
}

I get the following result:

This SqlTransaction has completed; it is no longer usable.

at System.Data.SqlClient.SqlTransaction.ZombieCheck()
at System.Data.SqlClient.SqlTransaction.Commit()
at Microsoft.EntityFrameworkCore.Storage.RelationalTransaction.Commit()

I did some digging into your source code and found that BulkInsert tries to open a connection without checking whether one is already open. I also found that the SqlBulkCopy instance is never given a reference to the transaction, even when one is already in progress.

I played around with your project a bit and made it work for my needs, but my question is: is this behavior by design, or could it be a bug? See the sketch below for what a transaction-aware call could look like.
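
A hedged sketch of a transaction-aware bulk copy, assuming System.Data.SqlClient and the GetDbTransaction() extension from Microsoft.EntityFrameworkCore.Storage:

var connection = (SqlConnection)context.Database.GetDbConnection();
var transaction = (SqlTransaction)context.Database.CurrentTransaction?.GetDbTransaction();

using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
{
    sqlBulkCopy.DestinationTableName = "dbo.Assets"; // hypothetical destination table
    sqlBulkCopy.WriteToServer(reader);               // participates in the ambient transaction
}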

Create extension method

I'm trying to create an override for the BulkInsert function. In my DbContext I use an override of SaveChanges() where I edit several properties of each object and then return base.SaveChanges().

Is it possible to create something like this for BulkInsert?
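
There is no virtual method to override, but a wrapper extension can mimic the SaveChanges override; a minimal sketch with a hypothetical edit callback:

using System;
using System.Collections.Generic;
using EFCore.BulkExtensions;
using Microsoft.EntityFrameworkCore;

public static class BulkInsertWrapper // hypothetical helper, not part of the library
{
    public static void BulkInsertWithEdits<T>(this DbContext context, IList<T> entities, Action<T> edit)
        where T : class
    {
        foreach (var entity in entities)
        {
            edit(entity); // apply the same per-entity edits the SaveChanges override performs
        }
        context.BulkInsert(entities);
    }
}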

Could BulkUpdate specify conditions?

Hey :)
Thanks very much for EFCore.BulkExtensions. However, could BulkUpdate specify conditions?
I want SQL like this:

update [dbo].[table1] 
set [IsActive] = arg1
where [Id] = arg2 and [OrderId] = arg3

And could BulkUpdate increment a counter? For example:

update [dbo].[table1]
set [count] = [count] + 1
where ...
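
For reference, a hedged sketch of how these could be expressed with the library's later Batch ops (the IQueryable BatchUpdate extension), using illustrative entity and variable names:

context.Table1
    .Where(t => t.Id == arg2 && t.OrderId == arg3)
    .BatchUpdate(t => new Table1 { IsActive = arg1 }); // conditional set

context.Table1
    .Where(t => t.OrderId == arg3)
    .BatchUpdate(t => new Table1 { Count = t.Count + 1 }); // increment based on the current value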
