
Find a character across multiple fields



Hi everyone,

I'm currently running a CSV export from FileMaker using the Write to Data File script step. The data is exported and then uploaded to MailChimp via Make.com. It works great, unless a comma is entered into a FileMaker field, which breaks the CSV structure. I'd prefer not to add much data-entry validation to the fields (there are 20) and instead just run a script right before the export that replaces any commas. I'm using a QuickFind on a specific layout to find all the commas in the database (about 2,000 records), and it does the job. Is there a way to go directly to the field that has the comma? I don't want to do this:

 

[screenshot of the scripted workaround]

 

 

Any suggestions and ideas are appreciated!

 

 


I would attack it right at data entry, via an auto-enter calculation, instead. Let me give you an example of how I usually treat phone numbers, which might even be pasted in from a website:

Filter ( Evaluate ( Quote ( Self ) ) ; "0123456789" )

...the reasoning behind the Evaluate ( is that colours and fonts could otherwise be pasted right into the database and litter it. You can get pretty specific about what to change ... but it takes different shenanigans to deal with various fields.
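A comma-specific variant of the same idea could be as simple as the following (just a sketch, assuming an auto-enter calculation on each exported text field, set to replace the existing value):

Substitute ( Evaluate ( Quote ( Self ) ) ; "," ; "" )

...which strips pasted-in styling the same way and removes commas at entry time.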

Another way to consider is Save/Send Records as Excel: by designing a specific layout for this purpose, you can use some of the invisible tools this command applies to get rid of these gremlins.

But take a look at this custom function; it does the opposite of Filter ( . With Filter you allow the input to be sifted through a string of permitted characters, whereas with the custom function you enter the values to look out for and it gets rid of them:

https://www.briandunning.com/cf/922 

--sd

Edited by Søren Dyhr

Thank you, Søren. My biggest concern is an address field, where a variety of characters could be used and I don't want to omit anything important.

 

You are correct, a lot of the info is a cut and paste from various sources, so it could be useful here.


1 hour ago, Fred_S said:

I'm currently running a CSV export from FileMaker using the Write To Datafile script step.

Why can't you use the native export feature? Then Filemaker will quote all exported fields, and any legitimate commas within them will be preserved.

Note that commas are not the only reserved character in CSV; you also need to pay attention to double-quotes and carriage returns.
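For instance, a (made-up) address value such as 123 Main St, Apt "B" travels through CSV as:

"123 Main St, Apt ""B"""

with the whole field wrapped in quotes and the embedded quote doubled; an embedded carriage return likewise stays safely inside the quotes.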

 


Hi Comment,

It's part of an automated process using Make.com. In order to get the data into ClickSend (I said MailChimp in error earlier), I have FileMaker Server compile the data and send it to a Mailhook as an attachment. The attachment needs to be a CSV (sadly). Once it is sent over, Make.com does the work via the ClickSend API.

[screenshot of the Make.com scenario]

 

I'm wondering, would Substitute be a better option for an address field if that's all I'm really looking to fix?


Just now, Fred_S said:

The attachment needs to be a CSV (sadly).

So? The CSV format is perfectly capable of carrying data with commas in it. And Filemaker has a built-in export feature that will create a properly* formatted CSV file for you, without you having to do all this extra work and - more importantly - without you having to mutilate your data by removing perfectly legitimate commas.

--
(*) With one exception, which I won't go into now (and which can be fixed).

 



 

13 hours ago, comment said:

So? The CSV format is perfectly capable of carrying data with commas in it. And Filemaker has a built-in export feature that will create a properly* formatted CSV file for you, without you having to do all this extra work and - more importantly - without you having to mutilate your data by removing perfectly legitimate commas.

--
(*) With one exception, which I won't go into now (and which can be fixed).

 

 

You make a very valid point, @comment. The problem I am having is that the Make.com module called Parse CSV Data will not accept the CSV file created with the Export Records script step on FileMaker Server. I have no clue why Make.com doesn't like the file; I have spent hours trying to get it to work, but it dies on the first row of good data. Make.com throws an error that the row count is wrong (maybe it doesn't see the line feed?). I've tried several formats such as ASCII and Unicode, you name it, and it does not work. The server is Windows (an FMPhost server), so I am really baffled and open to any suggestions, the fewer moving parts the better! 😁

The only way I can seem to get it to work is by using Write to Data File. For each record, I use a calculated field that creates a CSV row with quotes, and I append that to the data file. Originally, I used a CSV with no enclosing quotes, like this:

FirstName, LastName, Email, HomeAddress
FirstName, LastName, Email, HomeAddress
FirstName, LastName, Email, HomeAddress

But once a comma was put into the HomeAddress field, it would kill the import because the row count would be thrown off.

I have now fixed it by enclosing the values in quotes, like this:

"FirstName", "LastName", "Email", "HomeAddress"
"FirstName", "LastName", "Email", "HomeAddress"
"FirstName", "LastName", "Email", "HomeAddress"
 

Here's part of the code:

[screenshot of part of the script]
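For readers without the screenshot, a rough sketch of what such a row calculation can look like (the field names are assumed from the sample rows above; embedded quotes are doubled per the CSV convention):

"\"" & Substitute ( FirstName ; "\"" ; "\"\"" ) & "\",\"" &
Substitute ( LastName ; "\"" ; "\"\"" ) & "\",\"" &
Substitute ( Email ; "\"" ; "\"\"" ) & "\",\"" &
Substitute ( HomeAddress ; "\"" ; "\"\"" ) & "\"" &
Char ( 10 )  // line feed to end the row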

 

Edited by Fred_S
Adding quote

I am somewhat confused by your description.

If you are surrounding each field with quotes, then the field can contain a comma - so your original quest becomes irrelevant. OTOH, you need to make sure that any quotes within the field are escaped by doubling them.

As I said, the native export feature will do both for you. 

39 minutes ago, Fred_S said:

I have no clue why Make.com doesn't like the file

I would start by loading the 2 files (the "good" one produced by your current process, and the "bad" one produced by native exporting) into a text editor and finding the difference. Then we can see what can be done to eliminate this difference. 

--
Unrelated to the current issue, but:
You are using the Insert Calculated Result[] script step where you should be using Set Field[]. The Set Field[] step operates entirely at the data level (where you, as the developer, should perform all your tasks unless you are assisting the user in data entry). The Insert Calculated Result[] script step operates at layout level and will not work if the field is not on the current layout.
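To illustrate the difference (the Contacts table and ExportRow field here are hypothetical):

Set Field [ Contacts::ExportRow ; Contacts::FirstName & "," & Contacts::LastName ]
    // works regardless of which fields are on the current layout
Insert Calculated Result [ Contacts::ExportRow ; Contacts::FirstName & "," & Contacts::LastName ]
    // only works if Contacts::ExportRow is present on the current layout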

 


18 hours ago, Fred_S said:

Is there a way to go directly to the field that has the comma?

There is, by using a custom function like this one: https://www.briandunning.com/cf/810

...where, after the quick find, you run this calc:

mapexpr ( "Position ( Evaluate(_v) ; \",\" ; 1 ; 1 ) xor 0"; MiddleValues ( FieldNames ( "untitled" ; "" ) ; 6 ; 999 ) )

It returns a pilcrow-delimited list of 0's and 1's showing which field the comma is found in ... here I would perhaps use indirection, using the flagged 1 to point at the field in question. The MiddleValues ( is there to skip the "housekeeping fields" such as timestamp and user identification, which are unlikely candidates for the desired export.
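A native alternative for locating the field, for anyone who prefers not to install a custom function (a sketch only; it assumes FileMaker 18 or later for While ( ), and a layout that carries only the exported fields), would be to return the name of the first field in the current record that contains a comma:

While ( [
    fields = FieldNames ( Get ( FileName ) ; Get ( LayoutName ) ) ;  // fields on the current layout
    i = 1 ;
    hit = ""
] ;
    i ≤ ValueCount ( fields ) and IsEmpty ( hit ) ;
[
    f = GetValue ( fields ; i ) ;
    hit = If ( Position ( GetField ( f ) ; "," ; 1 ; 1 ) > 0 ; f ) ;
    i = i + 1
] ;
    hit
)

The returned name could then drive the indirection mentioned above.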

But this should only be a measure for dealing with already littered data; I still believe it should have been handled during data entry in the first place.

BTW, does MailChimp's storage of the data conflict with the EU's GDPR legislation ...?

--sd

Edited by Søren Dyhr

On 4/9/2024 at 2:39 AM, comment said:


I would start by loading the 2 files (the "good" one produced by your current process, and the "bad" one produced by native exporting) into a text editor and finding the difference. Then we can see what can be done to eliminate this difference. 

--
Unrelated to the current issue, but:
You are using the Insert Calculated Result[] script step where you should be using Set Field[]. The Set Field[] step operates entirely at the data level (where you, as the developer, should perform all your tasks unless you are assisting the user in data entry). The Insert Calculated Result[] script step operates at layout level and will not work if the field is not on the current layout.

 

Hi @comment, sorry about the confusion. When I create the CSV file using the Create Data File method, it works fine and uploads to the Make.com destination, a CSV parser. But when I use the built-in Export Records function to export it as a CSV, the Make.com parser says that the first row has the wrong column count. Attached is a file that does not work, called Produces Error.csv; it was created by FileMaker Server and sent via email. The no error.csv file was created by hand on a Mac using TextEdit, and it works fine. Both are UTF-8 and I just can't determine what the issue might be.

Thanks for the direction on using Set Field instead of Insert Calculated Result. Old habit 🙂

Attachments: Produces Error.csv, no error.csv

On 4/9/2024 at 5:10 AM, Søren Dyhr said:

There is, by using a custom function like this one: https://www.briandunning.com/cf/810

Thank you, @Søren Dyhr. I see what you mean. Maybe I can just add a script trigger to the fields in question to replace the commas? The MailChimp data will be used in the U.S. only.


43 minutes ago, Fred_S said:

Both are UTF-8 and I just can't determine what the issue might be.

I would prefer to see two files with identical data, so that the only differences are in the format structure. Still, one difference is obvious and it is also what I suspected: the file produced by exporting uses carriage returns as line endings; while the file produced by writing to a data file uses line feeds.

This is also confirmed by what the help says:

"Commas separate field values. Carriage return characters separate records. Quotation marks enclose all field values."
https://help.claris.com/en/pro-help/content/comma-separated-text-format.html

"Append line feed adds a line feed character to the end of the data written to the file."
https://help.claris.com/en/pro-help/content/write-to-data-file.html


IMHO, the simplest way to produce the wanted file is to export the records as CSV, then read the resulting file as data file, change the line endings, and write the result back to the exported file. You can also add the header at the same time.
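A sketch of that recipe as script steps (not necessarily how the demo below does it, just an outline; the path, header text, and variable names are placeholders, and writing to a second file sidesteps having to truncate the original):

Set Variable [ $path ; Value: Get ( TemporaryPath ) & "export.csv" ]
Export Records [ With dialog: Off ; "$path" ; Unicode (UTF-8) ]
Open Data File [ "$path" ; Target: $fileID ]
Read from Data File [ File ID: $fileID ; Read as: UTF-8 ; Target: $csv ]
Close Data File [ File ID: $fileID ]
# Prepend the header row and convert CR line endings to LF
Set Variable [ $csv ; Value: "FirstName,LastName,Email,HomeAddress" & Char ( 10 ) & Substitute ( $csv ; Char ( 13 ) ; Char ( 10 ) ) ]
Set Variable [ $out ; Value: Get ( TemporaryPath ) & "export_lf.csv" ]
Create Data File [ "$out" ; Create folders: Off ]
Open Data File [ "$out" ; Target: $fileID ]
Write to Data File [ File ID: $fileID ; Data source: $csv ; Write as: UTF-8 ]
Close Data File [ File ID: $fileID ]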

Attached is a demo file that does all of that.

Do note that the data stored in the database is not modified in any way during this process. There is no need to remove the commas, since the exported fields are automatically quoted. And even if there were such requirement, it would be performed on the exported data only, not on the original as you were attempting to do at the beginning. That would be just wrong.

Similarly, there is no need to have the header text, stored or calculated, inside the source table. If it's a custom header, it can be stored in the script. If it's alright to use the actual field names, you can export as .mer instead of .csv.

 

ChangeLineEndings.fmp12


6 hours ago, Fred_S said:

Maybe I can just add a script trigger to the fields in question to replace the commas?

Please excuse my limited imagination here, but if the damage is done and the data is littered, auto-enters are clearly wasted.

How would you approach it? On record load, relying on the found set the quick find has already established? How would you distinguish between simple browsing and a found set established by a quick find? Something like $$disable_triggers, or ...?

--sd

Edited by Søren Dyhr

On 4/8/2024 at 7:30 PM, Fred_S said:

Is there a way to go directly to the field that has the comma? I don't want to do this:

There is - if the damage is done:

--sd

[screenshot of the suggested script]


@Søren Dyhr

1. This does not "go directly to the field that has the comma"; it loops among all the fields on the layout until it finds one with a comma. 

2. You could have done all that jazz with a single Perform Find/Replace script step (see the sketch after this list).

3. I won't ask what happens if there is a field that's not on the layout but is in the export field order.

4. I will ask this:

Why do you keep barking up this obviously wrong tree??? Do you really think that fields with commas cannot survive being exported to CSV? And, even supposing that you do, do you really think that the correct solution is to purge those commas permanently from the original data inside those fields? Really???
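For reference, the single-step version mentioned in point 2 might look roughly like this (a sketch; the exact options are assumptions about how one would configure it, searching all records and all fields):

Perform Find/Replace [ With dialog: Off ; "," ; "" ; Replace & Find All ]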

 

 

Edited by comment

1 hour ago, comment said:

Why do you keep barking up this obviously wrong tree???

Because there was talk about script triggers, which means the strategic considerations haven't begun to make sense yet.

 

9 hours ago, Fred_S said:

Maybe I can just add a script trigger to the fields in question to replace the commas?

It could be said another way, by a famous military thinker:

"Strategy without tactics is the slowest route to victory. Tactics without strategy is the noise before defeat."

... and the case here is an inherited solution, where something has to be done about what has already happened (hence the original question), since there isn't time for more than damage control. I don't much mind whether the diagnostics in the organisations I work for are done properly; I care about solving the problem at hand, even if the person who assigned it to me is out of his mind. Nothing here really calls for idealism, because idealism makes you unemployable.

 

--sd

Edited by Søren Dyhr

19 hours ago, comment said:

2. You could have done all that jazz with a single Perform Find/Replace script step.

Nice one, close but no cigar. I've clocked it on a set of data from Brian Dunning, admittedly not searching for a comma but for a lowercase "a", for what that's worth in the comparison.

My script clocked in at 211847, while your suggestion 2 clocked in at 218352, measured with Get ( CurrentTimeUTCMicroseconds ).

I then turned to scripted replaces, which suffer from the lack of direct indirection and would need to use Evaluate ( and "headless" assignments:

 

[screenshot of the scripted replace]

But since a large number of extra calculations are often made in vain, it clocked in at 330731, even though the script is somewhat simpler in its approach.

--sd

Edited by Søren Dyhr

I don't think this is relevant to the topic (as I said several times already).

But since there is some value to it in general:
I don't know the size of the data set you used. I see that the smallest set at https://www.briandunning.com/sample-data/ is 500 records x 12 fields. If you measured a difference of 6.5 milliseconds, I would not consider it significant, especially if you only performed the test once.

Now will you finally give it a rest?

 


On 4/11/2024 at 2:33 AM, comment said:

Still, one difference is obvious and it is also what I suspected: the file produced by exporting uses carriage returns as line endings; while the file produced by writing to a data file uses line feeds.

IMHO, the simplest way to produce the wanted file is to export the records as CSV, then read the resulting file as data file, change the line endings, and write the result back to the exported file.

Attached is a demo file that does all of that.

ChangeLineEndings.fmp12

Thank you, @comment, this fixed the issue! Over at Make.com, someone also identified the issue as line feed vs carriage return. I must have missed that text in the help documentation, because I surely looked. I will make a note of it for future reference. Thank you for the guidance!

Edited by Fred_S
