I'm trying to read an XML file into a DataTable using ReadXml:

    DataSet ds = new DataSet();
    ds.ReadXml("myxml.xml");
    DataTable table = new DataTable();
    table = ds.Tables[0];

It works fine on small files, but on files larger than 1 GB it fails with:

"System.OutOfMemoryException" in mscorlib.dll

The DataTable is then loaded into a database. Is it possible to read the file in portions somehow: load a portion into the database, clear the table, read the next portion, and so on until the end of the file?

  • Data of this size should be loaded not through a DataSet (which places it all in memory) but through the bulk operations your database provider offers. - mals
  • I need to load into SQL Server Compact; it has no such built-in bulk load, and the third-party library accepts a DataTable or List<T>. Here is the link to the library. - e1s
  • @andreycha that is my question :) I just thought I could explain what I need more clearly in Russian. - e1s
  • @e1s ha :). Well, they understood you correctly there and gave the right answer: parse the file by hand. - andreycha

3 answers

Switch to an x64 process. Otherwise, load the data from the file through bulk operations.

  • I don't think it's a matter of bitness; it simply requires a lot of resources. - e1s

Try loading through a BufferedStream:

    DataSet ds = new DataSet();
    using (FileStream stream = File.OpenRead("myxml.xml"))
    {
        using (BufferedStream buffered = new BufferedStream(stream))
        {
            ds.ReadXml(buffered);
        }
    }

It reads the file in small pieces rather than all at once. True, you may still hit the limit of the loaded DataSet itself taking up too much memory, but that is another question :).

  • Tried that. Apparently the DataSet keeps growing and the same error occurs. How can I read it in chunks? - e1s
  • @e1s then only by hand: read the data from the file with an XmlReader, accumulate a certain number of records, and bulk-insert them into the database. - andreycha
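The approach from the last comment (stream with XmlReader, accumulate a batch of rows, flush, clear, repeat) can be sketched roughly as follows. The element name "row", the attribute-per-column layout, and the flush callback are assumptions; substitute the real element names and the bulk-insert call of whatever library is actually used:

```csharp
using System;
using System.Data;
using System.Xml;

static class ChunkedXmlLoader
{
    // Streams <row .../> elements one at a time, filling the DataTable
    // from attributes whose names match the table's columns. Every
    // batchSize rows the flush callback is invoked (e.g. a bulk insert
    // into SQL Server Compact) and the table is cleared, so memory use
    // stays bounded regardless of the file size. Returns the row count.
    public static int Load(XmlReader reader, DataTable table, int batchSize,
                           Action<DataTable> flush)
    {
        int total = 0;
        while (reader.ReadToFollowing("row"))   // assumed element name
        {
            DataRow dataRow = table.NewRow();
            foreach (DataColumn column in table.Columns)
            {
                // Missing attributes become DBNull rather than failing.
                dataRow[column.ColumnName] =
                    (object)reader.GetAttribute(column.ColumnName) ?? DBNull.Value;
            }
            table.Rows.Add(dataRow);
            total++;
            if (table.Rows.Count >= batchSize)
            {
                flush(table);   // hypothetical bulk-insert step
                table.Clear();  // drop the rows we just wrote
            }
        }
        if (table.Rows.Count > 0)   // flush the final partial batch
        {
            flush(table);
            table.Clear();
        }
        return total;
    }
}
```

Call it with an XmlReader created over a FileStream; only one batch of rows is ever held in memory at a time.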

Read the file with the XmlReader class: https://msdn.microsoft.com/ru-ru/library/system.xml.xmlreader(v=vs.110).aspx

    XmlTextReader textReader = new XmlTextReader("file.xml");
    while (textReader.Read())
    {
        XmlNodeType nType = textReader.NodeType;
        if (nType == XmlNodeType.XmlDeclaration) { /* ... */ }
        if (nType == XmlNodeType.Comment) { /* ... */ }
        if (nType == XmlNodeType.Attribute) { /* ... */ }
        if (nType == XmlNodeType.Element) { /* ... */ }
        if (nType == XmlNodeType.DocumentType) { /* ... */ }
    }
  • There is no ready-made method there; the XML is full of attributes, and creating a table to store them is time-consuming. - e1s
  • Try the XmlDocument class; it can load from both a reader and a Stream. You can get nodes from it through its properties, for example ChildNodes: "Returns all the child nodes of this node. (Inherited from XmlNode.)" That's the general direction for you :) I don't work with it myself, so I won't give more details. - IvanZakirov
  • The stream can be tuned to read byte by byte, etc. In general, there is plenty of room to dig. - IvanZakirov
  • There is no need to do anything by hand (I can't comment there because of my low reputation). Read the data into an XmlDocument, or first into a variable as a stream (into memory, with the data, so as not to cause failures in the DLL), then process it gradually with XmlDocument through its properties, etc. Build an XML schema and read it with DataTable.ReadXml and DataTable.ReadXmlSchema (well, or something like that). - IvanZakirov
  • Also on MSDN: "How to: Stream XML Fragments with Header Information" msdn.microsoft.com/ru-ru/library/bb387008.aspx and "How to: Perform Streaming Transform of Large XML Documents" msdn.microsoft.com/ru-ru/library/bb387013.aspx - IvanZakirov
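The MSDN articles mentioned in the last comment describe streaming XML fragments with LINQ to XML: position an XmlReader on each repeating element and materialize just that one element with XNode.ReadFrom, so only a single fragment lives in memory at a time. A minimal sketch of that pattern, assuming a repeating element name passed in by the caller:

```csharp
using System.Collections.Generic;
using System.Xml;
using System.Xml.Linq;

static class XmlFragmentStream
{
    // Lazily yields each element with the given name as an XElement.
    // XNode.ReadFrom consumes exactly one element and leaves the reader
    // positioned on the next node, so we only call Read() when we are
    // not sitting on a matching element.
    public static IEnumerable<XElement> StreamElements(XmlReader reader, string name)
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == name)
            {
                yield return (XElement)XNode.ReadFrom(reader);
            }
            else
            {
                reader.Read();
            }
        }
    }
}
```

Each yielded XElement can then be mapped to a DataRow (or passed straight to a bulk-insert batch) and discarded, which avoids ever holding the whole 1 GB document in memory.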