December 1, 2019

I recently switched to Office 2019 as the format for file import; previously I used LibreOffice, saving as .dif or Excel 95-2004, etc. Now, however, when I import a file created in Office 2019 and try to import the records (say, 1,000 records), FM thinks it is importing 65k records. I have to resort to cancelling during the import after a specified number, which is hit or miss, and then deleting the blank records. What's happened? Why is this an issue?
December 1, 2019

18 minutes ago, enquirerfm said: "using Office 2019 as a format for file import"

I presume you mean the .xlsx Excel Workbook format (which is known under many names, but "Office 2019" is not one of them).

30 minutes ago, enquirerfm said: "FM thinks it is importing 65k records"

Apparently FM detects data in 65k rows.

33 minutes ago, enquirerfm said: "What's happened? Why is this an issue?"

Why don't you post an example of the source file, along with your import mapping? Without it, all you will get are guesses.
December 1, 2019

This has always happened to me too, for a once-a-week import from QuickBooks. My two options were:

1. Let it go, since it takes a total of 30 seconds. Through a script I basically import into its own table, find the relevant records, delete the rest, and then import into the proper table.

2. Go into Excel and clean up the sheet, which takes 20 seconds, then save the sheet. The import then happens instantaneously. So basically the same.

I haven't found an easy, quick and/or automatic way to clean up the spreadsheet. The only way I can clean it up is to highlight all the relevant data, copy, open a new sheet, and Paste Values only. Then import that sheet instead.

Edited December 1, 2019 by Steve Martino
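The manual copy / Paste-Values-only cleanup described above can also be scripted. Here is a minimal sketch in Python using pandas (the column names and data are illustrative, not from the actual QuickBooks export): dropping rows in which every cell is empty keeps only the real records, which is what the by-hand cleanup achieves.

```python
import pandas as pd

# Simulate a sheet whose used range extends past the real data:
# rows 0-2 hold records, rows 3-5 are effectively blank (the kind
# of phantom rows a filled-down formula leaves behind).
# In practice you would load the export with pd.read_excel(...).
df = pd.DataFrame({
    "name": ["Ann", "Bob", "Cy", None, None, None],
    "amount": [10, 20, 30, None, None, None],
})

# Drop rows where every cell is empty -- the scripted equivalent
# of highlight-copy-Paste-Values into a fresh sheet.
clean = df.dropna(how="all")

print(len(clean))  # 3
# A real workflow would finish with:
#   clean.to_excel("export_clean.xlsx", index=False)
```

Saving the cleaned frame back to .xlsx before importing should give FileMaker a used range that matches the actual record count.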
December 22, 2019  Author

So I found the issue: I had created a formula field that concatenated the contents of various fields. If I double-clicked the cell's fill handle to copy the formula down automatically, it created this issue. If I simply highlighted the cell and dragged it down the required number of rows, it wasn't a problem.
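That diagnosis can be demonstrated outside FileMaker: a formula copied far past the last real record expands the sheet's used range, and the used range is what an importer reads. A minimal sketch with openpyxl (the library choice and row counts here are illustrative assumptions, not part of the original workflow):

```python
from openpyxl import Workbook

wb = Workbook()
ws = wb.active

# Three real records plus a header.
ws.append(["id", "doubled"])
for i in range(1, 4):
    ws.append([i, f"=A{i + 1}*2"])

# Simulate a fill-down that overshot the data: the formula gets
# copied into rows 5..1000 even though those rows hold no records.
for row in range(5, 1001):
    ws.cell(row=row, column=2, value=f"=A{row}*2")

# The worksheet's used range now spans 1000 rows -- this inflated
# extent is what makes the importer see far more rows than records.
print(ws.max_row)  # 1000
```

Dragging the fill handle only as far as the data, as described above, keeps the used range tight, which is why that variant imports cleanly.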