
Splitting a FileMaker database for separation model



Hello everyone,

I'm doing something I know I've done a number of times before, but for some reason, now I'm struggling with it.  I am a little rusty with FM, so it's possible I'm doing something silly...

I have my development done; it works great. My client is a few hundred miles away, so I'd like to separate data and interface to make changes whenever requested and simply ship him the new front end. I've done this before on much bigger projects; I've never had a problem.

Today, as I'm splitting and redirecting tables from the front-end file to the back-end file, all of my calculations are breaking.  Before the redirection, I'd have calculations that look like this:

[screenshot: the working calculation]

After the redirection, the same calculations no longer work, and I have this:
[screenshot: the broken calculation]
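In plain text, and with purely illustrative names rather than my actual schema, the working version was an ordinary related-field calculation along the lines of

    Sum ( LineItems::Amount )

and after redirecting the table occurrence, the same calculation comes back with its reference broken, showing something like

    Sum ( <Table Missing>::Amount )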

I don't ever remember having this problem before, but as I said, there's some rust on my brain.  Can anyone clue me in on what's going sideways here?  And I'll apologize in advance if it's a rookie error.

Thanks very much for your help

Brent
 


4 minutes ago, Søren Dyhr said:

Watch this and you'll probably discover where you have forgotten something:

 

Yes, I actually watched that video a few days ago as a refresher (or to see if there was a new way of doing it).


Keep in mind, this was me doing a subset of my 100ish tables, committing the change, then looking at the front end. Now I'm completing them all before looking -- perhaps a partial job won't make everything hook up...


5 hours ago, BrentBollmeier said:

Keep in mind, this was me doing a subset of my 100ish tables

Table occurrences or genuine tables? I have never encountered the latter in such numbers. Could something have gone wrong during development, when spec'ing the desired behaviour of each table, given the urge to keep the unstored calcs? The objective, as I recall it, is to keep the data and the massaging of the data separate.

What I mean is: have you done something to the structure that prevents the "lobotomy"? Note that Todd, in the video, says some but not necessarily all!

--sd


You're saying that the calculation fields should be in the front-end as opposed to the back-end?  Yes, I do see your point.  However, as I mentioned, I've done this many times before and never had to go to those lengths to split the database.


3 hours ago, BrentBollmeier said:

You're saying that the calculation fields should be in the front-end as opposed to the back-end? 

No, what I'm saying is that the unstored fields should be kept to a bare minimum and instead be substituted with event triggers firing scripts only when it's absolutely necessary.

But the, in my book, large number of tables gives away a less than ideal normalization of the structure.

Perhaps you need to remove each old connection every time you redirect, instead of doing them all at once after redirecting. The place where the occurrence sits in the graph should be completely empty when the deed is done.
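As a rough sketch of the trigger idea, with hypothetical table and field names: instead of an unstored calc such as Total = Sum ( LineItems::Amount ), make Total a plain stored number field and let a script, fired from a trigger such as OnObjectSave on the portal or from a button, keep it up to date:

    # script "Update Invoice Total" (hypothetical), run from a trigger or button
    Set Field [ Invoices::Total ; Sum ( LineItems::Amount ) ]

That way the evaluation only happens when something actually changes, not every time the field is drawn.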

--sd


2 hours ago, Søren Dyhr said:

No, what I'm saying is that the unstored fields should be kept to a bare minimum and instead be substituted with event triggers firing scripts only when it's absolutely necessary.

But the, in my book, large number of tables gives away a less than ideal normalization of the structure.

Perhaps you need to remove each old connection every time you redirect, instead of doing them all at once after redirecting. The place where the occurrence sits in the graph should be completely empty when the deed is done.

--sd

100 instances of tables in my relationship graph.  Only 30 actual tables.  Also, I tried doing them a few at a time with the same result.


17 minutes ago, BrentBollmeier said:

100 instances of tables in my relationship graph.  Only 30 actual tables.  Also, I tried doing them a few at a time with the same result.

Strange, there must be something you skate over with less meticulousness - do you clone the file first?

--sd



I figured it out... and it was so embarrassing I don't want to admit to it in public... 😅

Thanks for your help - I genuinely appreciate it!

Brent


6 hours ago, BrentBollmeier said:

Only 30 actual tables. 

Glad you discovered where the weakness was ... but I'm still a little puzzled by the relatively high table count this solution consists of. Perhaps it's just my imagination that isn't stretching far enough? Are you making subgroupings of entities, and if yes, why? An example being ...

In the few lines you have revealed to us, there is more than one calculation field containing exactly the same calc - why is that?

--sd


47 minutes ago, Søren Dyhr said:

Glad you discovered where the weakness was ... but I'm still a little puzzled by the relatively high table count this solution consists of. Perhaps it's just my imagination that isn't stretching far enough? Are you making subgroupings of entities, and if yes, why? An example being ...

In the few lines you have revealed to us, there is more than one calculation field containing exactly the same calc - why is that?

--sd

It's a bit of a complex app that tracks a complex process, each step with many parameters.  And I only have the multiples of the same calc for debugging/dev purposes - I hadn't cleaned them up yet, that's all.

This is my relationship graph.

[screenshot: relationship graph]


7 hours ago, BrentBollmeier said:

I figured it out... and it was so embarrassing I don't want to admit to it in public... 😅

That's too bad. Because it could help someone else struggling with the same problem.

Now we have a thread of considerable length and of absolutely no use to anyone.

 


3 minutes ago, comment said:

That's too bad. Because it could help someone else struggling with the same problem.

Now we have a thread of considerable length and of absolutely no use to anyone.

 

Fair point. 

After splitting the database, I was looking at the front-end file with the tables still in it, and not the back-end file. 
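For what it's worth, the sanity check I should have done first was simply confirming which copy of the file I actually had open, e.g. by evaluating

    Get ( FileName )

in the Data Viewer before concluding the redirection had failed.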


1 hour ago, BrentBollmeier said:

And I only have the multiples of the same calc for debugging/dev purposes - I hadn't cleaned them up yet, that's all.

I seem to remember that BOMs (bills of materials) could benefit from a recursive self-join structure ... in order to facilitate sub-parts of sub-parts?
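As a sketch, assuming a hypothetical Parts table with id, parent_id and name fields: relate Parts::id to a second occurrence, say PartsChildren::parent_id, and a portal on PartsChildren lists the direct sub-parts. The same list can be pulled in a calculation with

    ExecuteSQL (
      "SELECT name FROM Parts WHERE parent_id = ?" ;
      "" ; "" ;
      Parts::id
    )

Sub-parts of sub-parts then come from walking that relationship level by level, typically with a looping script, since FileMaker's SQL has no recursive queries.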

--sd


7 minutes ago, Søren Dyhr said:

I seem to remember that BOMs (bills of materials) could benefit from a recursive self-join structure ... in order to facilitate sub-parts of sub-parts?

--sd

Good idea.  Thanks!


With the availability of the data migration tool and products like OttoFMS to automate updates to production files, I have to question the need to separate the UI from the data.
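For example, a typical update with the Claris data migration tool is one command run against the live file and a clone of the new build, roughly like the following (flag names quoted from memory, so treat this as a sketch and check the Claris documentation):

    FMDataMigration -src_path Live.fmp12 -src_account admin -src_pwd xxxx -clone_path NewBuild_Clone.fmp12 -clone_account admin -clone_pwd xxxx -target_path Live_Migrated.fmp12

OttoFMS automates that kind of deployment end to end.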
 

Disclaimer: I work for Proof+Geist. OttoFMS is free. We don't separate solutions as an answer to updating solutions. We might separate for separation of concerns.


26 minutes ago, bcooney said:

With the availability of the data migration tool and products like OttoFMS to automate updates to production files, I have to question the need to separate the UI from the data.
 

Disclaimer: I work for Proof+Geist. OttoFMS is free. We don't separate solutions as an answer to updating solutions. We might separate for separation of concerns.

I wasn't familiar with OttoFMS, but I'll look into it.

I've just always been more comfortable splitting things up to drop in UI changes without worrying about the data.


19 hours ago, BrentBollmeier said:

It's a bit of a complex app that tracks a complex process

It's a tough call to decide between normalization and subtyping:

 


17 hours ago, bcooney said:

I have to question the need to separate the UI from the data.

It would however be a nice feature to have something like Visual Studio offers at compile time, where the compiler looks over the implications of the solution and substitutes better code for what the developer originally entered. An example in FileMaker being the excessive use of unstored calc fields visible on layouts, pasted in there merely because you can get away with it in a spreadsheet-ish realm. Couldn't some AI add comments as to why certain lines have been replaced with better ones?

On one of my holidays I encountered a B&B landlord who had a sideline in assembler pattern recognition, suggesting faster changes to code, and he actually made a living that way ... something an AI would master without breaking a sweat?

--sd


20 hours ago, BrentBollmeier said:

I wasn't familiar with OttoFMS, but I'll look into it.

I've just always been more comfortable splitting things up to drop in UI changes without worrying about the data.

Does that truly pan out? Even when data separation was done to avoid error-prone imports, I often found the need to add fields to the data file.
 

Soren, transactional scripting is our approach for denormalization, which is often required for performance. 
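As a sketch of what I mean, with hypothetical names and using the transaction script steps available in recent FileMaker versions (error handling omitted): when a line item is edited, a script refreshes the stored, denormalized totals on the parent inside a transaction, so the changes either all commit or all revert:

    Open Transaction
      Set Field [ Invoices::Total ; Sum ( InvoiceLines::Amount ) ]
      Set Field [ Invoices::LineCount ; Count ( InvoiceLines::ID ) ]
    Commit Transaction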

