Virtual List with large data sets


This topic is 2250 days old. Please don't post here. Open a new topic instead.

Recommended Posts

It seems that the way I have implemented virtual lists bogs down on large (10k+) record sets. Building the $$data variable is fast enough (over the LAN, 7 seconds for 2k records up to 13 seconds for 3k), but then building the $$col variables takes what seems like exponentially more time (11 seconds for 1k records, 41 seconds for 2k, 82 seconds for 3k...).

Is there a way to improve the performance?

 

[Attachment: Screen Shot 2016-04-22 at 10.57.32 am.png]
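A likely cause of those quadratic-looking timings is that each of the 36 $$col extractions re-scans the entire $$data string from the start. A minimal Python sketch (an illustration, not FileMaker; the names here are hypothetical) of blob-scanning versus indexing each row once:

```python
# Illustration (not FileMaker) of why per-value extraction from one large
# delimited blob grows superlinearly: every lookup re-scans the whole
# string, so total work grows roughly with rows x columns x string length.

rows = [f"row{i}" for i in range(1, 1001)]
blob = "\r".join(rows)                 # one big $$data-style blob

def get_value(blob: str, n: int) -> str:
    """Mimics GetValue: re-processes the entire blob on every call."""
    return blob.split("\r")[n - 1]

# Linear alternative: index each row once up front (the $$data[N] idea).
indexed = {i + 1: row for i, row in enumerate(rows)}

assert get_value(blob, 500) == indexed[500] == "row500"
```

The second approach pays the parsing cost once, which is why timings stay roughly linear as the record count grows.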


I've run a lot of virtual list stuff on data sets between 3 - 10k records, and it is nearly instant. It kind of depends on what you are doing with it.

What does the "Value_GetFromPipeDelimited" function do? Or more accurately, what is the calc?

Also, what is the use-case for setting 36 columns of variables? Not arguing for or against, curious what you are doing with it. :)


It's a CF to retrieve data from a data row that's pipe delimited:

// Retrieves the value from a pipe-delimited data set at position X
// Useful for parsing an FQL query that uses | as the delimiter
// Note - requires a final pipe on the data set

Let ( [
    Begin = Position ( Data ; "|" ; 1 ; Number - 1 ) + 1 ;
    End = Position ( Data ; "|" ; 1 ; Number ) - Begin
] ;

Middle ( Data ; Begin ; End )
)

And the use case is that I then need to export that set as a .csv for ingestion into a third-party system - I've got no control over the final format; it just needs to be 36 columns in the right order...
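For readers less familiar with FileMaker's Position/Middle idiom, here is a rough Python equivalent of the custom function above. It also makes the CF's contract explicit: the data must carry a trailing "|" after the final value, otherwise there is no closing pipe for End to find and the last value cannot be extracted.

```python
# Rough Python sketch (not FileMaker) of the Value_GetFromPipeDelimited
# custom function above. Like the CF, it assumes a trailing "|" after
# the final value.

def value_get_from_pipe_delimited(data: str, number: int) -> str:
    # number is 1-based, matching the CF's Number parameter.
    values = data.split("|")   # the trailing "|" yields an empty last element
    return values[number - 1]

assert value_get_from_pipe_delimited("a|b|c|", 1) == "a"
assert value_get_from_pipe_delimited("a|b|c|", 3) == "c"
```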


Jeez, people keep doing this.

Do NOT build a single 10K row blob of text.

Use a global repeating variable.

Each line of the list is $$data[N]

And build rows, not columns.


Bruce,

I have read some of your posts promoting $$data[N], but I can't find any! Can you please provide a link to a thread that discusses the use of $$data[N]?

Thanks,

Barbara


Excellent, Wim, that's the post I remembered.

 

Nope, not the discussion I thought it was. No mention, afaik, of using $$var[N]. Also, does this discussion pre-date the List Of summary type? That is my preferred way of capturing IDs from a found set.


In its most basic form, a virtual list has two fields:

N - an integer, stored, going from 1 to (as many records as you have created). Let's say 100.

Display - an unstored calc: GetValue ( $$Data ; N )

You build a return-delimited set of data and put it into the variable $$data.

In the proposed variation, you instead put EACH ROW of the data into $$data[N].

If you have first captured the List Of summary data into a single $variable or $$variable, you will have to process it (split it) into individual repeats.

The unstored calc field then becomes $$data[N].

You can also use actual repeating fields for this - either a global repeating field or a stored repeating field.
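The two-field model just described can be sketched in Python (an illustration only, not FileMaker; field and variable names mirror the post):

```python
# Sketch (not FileMaker) of the basic virtual list model: each record
# stores an integer N, and the unstored Display calc resolves against
# either one $$Data blob or per-row $$data[N] repetitions.

records = [{"N": n} for n in range(1, 101)]            # 100 pre-created records

# Variant 1: one return-delimited $$Data blob.
data_blob = "\r".join(f"value {n}" for n in range(1, 51))
def display_blob(rec):
    lines = data_blob.split("\r")
    return lines[rec["N"] - 1] if rec["N"] <= len(lines) else ""

# Variant 2: one repetition per row, like $$data[N].
data_reps = {n: f"value {n}" for n in range(1, 51)}
def display_reps(rec):
    return data_reps.get(rec["N"], "")

assert display_blob(records[9]) == display_reps(records[9]) == "value 10"
assert display_blob(records[99]) == ""   # records past the data display blank
```

Either way, only records whose N falls inside the loaded data show anything, which is why you pre-create more records than you expect to need.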


OK, I can see putting the rows into $$data[N] and extracting each row based on that - then how can that data be formatted for a report, or on-screen display or exporting?

In this instance, I would like to display the data on-screen in a human-readable format to be checked and then export as a 36 column .csv.

And I've been searching for examples of the $$data[N] approach so as not to be asking these questions - but my Google-fu doesn't seem to be working for this...


You can still build your rows with whatever delimiter method you choose, and break them into columns.

So let's say you build the rows as pipe delimited. Then you could create a calc RowNSplit as Substitute ( $$data[N] ; "|" ; "¶" ).

Then Cell1 = GetValue ( RowNSplit ; 1 ), Cell2 = GetValue ( RowNSplit ; 2 ), etc. If you need embedded returns in a cell, you'll need to modify the calc to handle that.

I don't see why any of this needs explaining though. Splitting rows into multiple fields has always been a part of the method.
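The two-step calc chain above can be illustrated in Python (not FileMaker; the row content here is made up):

```python
# Python illustration (not FileMaker) of splitting one virtual list row
# into cells: RowNSplit converts the pipe-delimited row into a
# return-delimited list, then each CellK calc pulls one value by position.

row = "Jones|Widgets|42"                 # one row of $$data[N] (hypothetical)
row_n_split = row.replace("|", "\r")     # Substitute ( $$data[N] ; "|" ; "¶" )

def cell(n: int) -> str:                 # GetValue ( RowNSplit ; n )
    lines = row_n_split.split("\r")
    return lines[n - 1] if n <= len(lines) else ""

assert cell(2) == "Widgets"
assert cell(4) == ""                     # out of range behaves like GetValue
```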

 


1 hour ago, webko said:

OK, I can see putting the rows into $$data[N] and extracting each row based on that - then how can that data be formatted for a report, or on-screen display or exporting?

In this instance, I would like to display the data on-screen in a human-readable format to be checked and then export as a 36 column .csv.

So the idea is, place the data rows into a Repeating Variable as you understand.

Next, use your table of calculations as you would for any virtual table. You will have the following fields:

  • rowData = Substitute ( $$data[N] ; "|" ; "¶" ) // This converts the text string into something that's easy to grab data from. 'N' will be a constant field ( I usually use something like a serial number field )
  • column1 = GetValue ( rowData ; 1 ) // This grabs the first value of the row data.
  • column2 = GetValue ( rowData ; 2 ) // This grabs the 2nd value of the row data.
  • ...rinse and repeat.

Use the Virtual Table as the base for your report layout. Arrange and manipulate the UI as you would on any layout.


Here is a very basic example, using two storage methods for collecting the data to display:

  • $$data[N]
  • Global repeating field

Multiple fields are set up using a pipe-delimited separator.

I also have worked up a similar very simple example that lets you store collected-row report snapshots so they can be displayed later.

However, that version is not included here.

VListVariations.fmp12.zip
