
upd.: Here is the syntax that finally solved the issue: createSQL = "SELECT * INTO newlyCreatedTableName FROM [fileName.csv] IN 'folderPath' [""Text;HDR=YES;FMT=Delimited""]"

I'm struggling with a data transfer. I have a .csv with 6 million rows and I'm trying to import it into an Access DB by running code from Excel. Here is my code.

Function getAccCN(ByVal dbFullPath As String) As Object
    Set getAccCN = CreateObject("ADODB.Connection")
    getAccCN.ConnectionString = "Provider=Microsoft.ACE.OLEDB.16.0;Data Source=" & dbFullPath
End Function

Function createSQL() As String
    createSQL = "Select * INTO [" & _ 
                fileName & "] FROM [Data source = " & _
                repFile.ParentFolder.Path & _ 
                "\; Extended Properties=""Text;HDR=Yes;FMT=Delimited""]." & _ 
                repFile.Name ' repFile is a *.csv as "Scripting.FileSystemObject"

    Debug.Print createSQL ' returns following:
     ' Select * INTO [Classification] FROM [Data source = \\av-fs01.av.local\profiles$\...\Project IQVIA\; Extended Properties="Text;HDR=Yes;FMT=Delimited"].Classification.csv
     ' *.accdb and the table name and  *.csv have the same base name - "Classification"
End Function

Function uploadCSV() As Boolean
Dim CN As Object
    On Error GoTo ErrHandler
    Set CN = getAccCN(repFile.ParentFolder.Path & "\" & baseFileName & ".accdb")
    CN.Open
    CN.Execute createSQL() ' this raises the error
    Exit Function
ErrHandler:
    Debug.Print Err.Number ' = -2147467259  "Wrong argument."
       ' CN.Errors: Arguments are of the wrong type, are out of acceptable range, or are in conflict with one another.
End Function

Headers and first rows of the source.


So, I can't figure out how to solve the issue. I'd be very thankful for any help.

  • What is the size of the 6-million-row CSV file? More than 2 GB (the MS Access table limit)?
  • @ALeXceL it's 0.5 GB
  • FMT=Delimited(,)?
  • @BZngr FMT=Delimited(,) returns the same error, even with a 1-line .csv. I added a screenshot of the source data.
  • @ALeXceL, I have built Access tables of 9+ million rows well under the Access 2 GB limit, closer to 1 GB. At three columns, the OP can handle this operation. Even better if the OP uses a lookup table mapping class code to code description, keeping the main table to two columns and avoiding the repetition of text.

3 Answers


Since the problem is the number of rows to be imported at once, I suggest using Power Query for the import and performing the desired work there (Power Query is designed for that). Then divide the 6 million rows into 6 queries, each of which can be imported into Excel or serve as a basis for exporting to another application. I ran a test here and it worked: I created a CSV file with 6M lines containing the text "This is a 6M line file and this is line (lngPtr)". Once it was imported into Power Query, I split the single column into 4 others and finally applied Table.Split, dividing the table into 6 tables (1M rows each), with the M script below (an example to be adapted):

let
    Font = Csv.Document(File.Contents("C:\temp\6MRows.CSV"), [Delimiter=",", Columns=1, Encoding=1252]),
    #"Split Table" = Table.Split(Font, 1000000)
in
    #"Split Table"

After the split, 6 tables will appear in Power Query. Load each of them as a new query. In less than 20 minutes the CSV was in PQ (Excel 365, Win10 Pro, i7/16 GB).
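For readers outside Power Query, the same split-into-fixed-size-chunks idea can be sketched in plain Python. This is a minimal stand-in, not the answer's actual method; the file names, chunk size, and cp1252 encoding (matching the `Encoding=1252` above) are illustrative:

```python
import csv
from itertools import islice
from pathlib import Path

def split_csv(src: str, out_dir: str, rows_per_chunk: int = 1_000_000) -> list[str]:
    """Split a headered CSV into chunk files of at most rows_per_chunk
    data rows each, repeating the header row in every chunk."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with open(src, newline="", encoding="cp1252") as f:
        reader = csv.reader(f)
        header = next(reader)
        part = 0
        while True:
            chunk = list(islice(reader, rows_per_chunk))  # next slice of rows
            if not chunk:
                break
            part += 1
            path = out / f"chunk_{part:02d}.csv"
            with open(path, "w", newline="", encoding="cp1252") as g:
                writer = csv.writer(g)
                writer.writerow(header)   # keep HDR=YES semantics per chunk
                writer.writerows(chunk)
            written.append(str(path))
    return written
```

Each chunk file can then be imported separately, which keeps any single import well under the problematic row count.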


2 Comments

Thank you for the answer, it's a nice idea. Could you add, maybe in a few words, how to import a PQ table into SQL Server? I haven't done that before.
Neither have I, I'm sorry. But since each of these 6 tables can be loaded into a spreadsheet, from there you can load them into SQL Server.

Please try this and give feedback.

Sub csv_Import()
    Dim wsheet As Worksheet, file_mrf As String
    Set wsheet = ActiveWorkbook.Sheets("Single")
    file_mrf = Application.GetOpenFilename("Text Files (*.csv),*.csv", , "Provide Text or CSV File:")
    With wsheet.QueryTables.Add(Connection:="TEXT;" & file_mrf, Destination:=wsheet.Range("B2"))
        .TextFileParseType = xlDelimited
        .TextFileCommaDelimiter = True
        .Refresh
    End With
End Sub

9 Comments

Thank you for the answer. I want to transfer about 6 million rows to an Access DB, and with a QueryTable object I can't find a way to do that.
What is '6 mln rows'? Do you mean 6 million rows? You may need Power Query, or a real database. Access can only handle 2 GB of data. There are many other alternatives out there, many of which can handle hundreds of millions of rows, or billions, or more.
It's a 0.5 GB file. The final destination is MS SQL Server. The thing is that I can't use BULK INSERT due to restrictions. I used a TextStream with parsing, but that means 2 hours of loading. For one file that's OK, but there can be several of them every month. The idea is to get the data into Access and then transfer it to the DWH. And I want to avoid the import wizard of Server Studio.
Will you do any processing on the data before loading it into SQL Server? What kind of treatment: filtering, formatting, record deletion?
@ASH, at 3 columns, Access can store many rows of small data within its 2 GB limit. See this old post: Maximum number of rows in an MS Access database engine table?
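The CSV-to-server leg discussed in the comments above (batched inserts instead of BULK INSERT) can be sketched with a generic DB-API `executemany` loop. This is only an illustration under assumptions: sqlite3 stands in for the real server, and the table and column names are hypothetical; against SQL Server you would open a pyodbc connection instead, and the same pattern applies.

```python
import csv
import sqlite3
from itertools import islice

def load_csv_in_batches(conn, csv_path: str, table: str, batch_size: int = 50_000) -> int:
    """Insert a headered CSV into `table`, batch_size rows per executemany call.
    Committing per batch keeps each transaction small."""
    total = 0
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(header)
        marks = ", ".join("?" for _ in header)
        sql = f"INSERT INTO {table} ({cols}) VALUES ({marks})"
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            conn.executemany(sql, batch)
            conn.commit()
            total += len(batch)
    return total
```

With pyodbc, setting `cursor.fast_executemany = True` makes the same loop considerably faster against SQL Server.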

Here is the syntax that solved the issue:

createSQL = _
     "SELECT * INTO newlyCreatedTableName" & _
     " FROM [fileName.csv] IN 'folderPath' [""Text;HDR=YES;FMT=Delimited""]"
