Dev best practices (Dev/staging/prod migrations)


This topic is 3308 days old. Please don't post here. Open a new topic instead.

How are others handling the process of moving code from Dev to Staging to Production?

Generally, FMP sucks at this, but SM360 makes it even more important.

How are you managing source control?

How do you migrate between systems?

Are you using off the shelf tools or custom tools for these processes?

Hoping to start a prolonged thread on how these things are managed.

What do I do?

I keep LISTS. Big monster lists of everything I need to do for a particular release when moving between environments. Then I track those changes per environment in a checkbox fashion, so I know which changes have been applied where. I keep the changes uniquely numbered so that I can track each one per platform.
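For what it's worth, that numbered-list-with-checkboxes approach can be automated with a few lines of script. A minimal sketch in Python (the class, the environment names, and the sample change are all my own invention, not anything FileMaker provides):

```python
# Track uniquely numbered release changes and which environments
# they have been applied to, mirroring the per-environment checklists.
ENVIRONMENTS = ["dev", "staging", "prod"]

class ReleaseChecklist:
    def __init__(self):
        self.changes = {}   # change number -> {"desc": ..., "applied": set of envs}
        self.next_id = 1

    def add(self, description):
        """Register a new change under the next unique number."""
        change_id = self.next_id
        self.changes[change_id] = {"desc": description, "applied": set()}
        self.next_id += 1
        return change_id

    def mark_applied(self, change_id, env):
        """Tick the checkbox for one change in one environment."""
        if env not in ENVIRONMENTS:
            raise ValueError(f"unknown environment: {env}")
        self.changes[change_id]["applied"].add(env)

    def pending(self, env):
        """List change numbers not yet applied to the given environment."""
        return [cid for cid, c in self.changes.items() if env not in c["applied"]]

checklist = ReleaseChecklist()
n = checklist.add("Add 'InvoiceTotal' calc field")
checklist.mark_applied(n, "dev")
print(checklist.pending("prod"))   # the new change is still pending on prod
```

Not glamorous, but it beats hand-maintained monster lists once there are more than a few releases in flight.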

What a nightmare.

Anyone doing anything better?

John-


  • 2 weeks later...

Hi John,

I usually use the separation model. I keep relationships and auto-enter calculations in the data file to a minimum, but I am not a purist about it. I keep track of fields I've added to Data on our test server, but I also back that up by running FMDiff between the two prior to an update. There is then little involved in simply making a few modifications to Data and replacing the UI. And no, I never develop on a live system ... I learned that lesson a long, long time ago. Every change made is added as a record in the Versions file (which I keep separate from the solution because it holds all changes a developer makes across many solutions and versions). In this way, we can quickly assess and refresh our recollections about a particular unexpected behavior.
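For anyone without FMDiff, the core of that pre-update check (comparing the field lists of the test and production data files) can be approximated by hand. A rough Python sketch, assuming you have already exported (table, field) pairs from each file by some means; the sample data is made up:

```python
def schema_diff(test_fields, prod_fields):
    """Compare two lists of (table, field) pairs and report drift
    in both directions."""
    test_set, prod_set = set(test_fields), set(prod_fields)
    return {
        "missing_in_prod": sorted(test_set - prod_set),
        "missing_in_test": sorted(prod_set - test_set),
    }

# Hypothetical exports from the two environments:
test = [("Contacts", "Email"), ("Contacts", "Phone"), ("Invoices", "Total")]
prod = [("Contacts", "Email"), ("Invoices", "Total")]
print(schema_diff(test, prod))
```

Anything in `missing_in_prod` is a field you still need to add before replacing the UI file; anything in `missing_in_test` is a surprise worth investigating.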

I also use DracoVentions Developer Assistant and search for 'missing' in scripts and tables. And we could not live without Base Elements, which allows us to check all questionable code, conditional formatting, etc. in a relational, exportable structure, which is perfect for multiple-developer work as well. We even keep our notes in it.

This is a pretty thin response (and why I hadn't responded earlier) but maybe it will get the ball rolling. :laugh2:


  • 2 weeks later...

Although it seems like everyone discourages development on a live system, the reality is that since FileMaker has no really good way to handle updates, it's very practical to perform most development on a live system.

For small additions/changes that have no real potential to be destructive, I make these changes on the live system. I have a separate FMServer install on a virtual system for testing out major changes. It doesn't matter if anything happens to that system. Not everyone can have this luxury, but that's the beauty of having a site license from FileMaker :yep:

A better way of doing this is to hide as much of the live development from regular users as possible. You can create all of the scripts, layouts, etc. in the background, but DO NOT give users any way to access them until you're done testing. As a developer, you can manually run scripts and go to 'hidden' layouts that others can't reach and shouldn't even know are there. The navigation is typically the last thing I build.


I've been developing on our live system for a couple of years. It is convenient but inherently dangerous, like the runaway Delete All script I mentioned. (By the way, always check after GTRR that you actually went. Lol.) As the system has grown more and more complex and more users get added, the danger increases.

I do occasionally develop in a development copy for really big, scary additions. I have scripts set up for importing all data from the old file to the new one when I'm done, but that process is tedious. With 15 GB of data it takes many hours.

I have the opportunity this year to rebuild version 2 from the ground up. So this time I will be using the Data Separation Model.


Because of the nature of our business (and the relentless urgency of management), I develop almost exclusively on our live system. [more about that here]

I rarely have issues with users stumbling upon in-development layouts or running scripts that I'm working on -- actually, I don't think I've ever had that happen. As @BrentHedden mentioned above, if you save navigation for last (really, you can do navigation whenever, as long as you don't give 'starting line' access up front), there shouldn't be any trouble.

The big issue I run into consistently, as described in that link, is being able to add fields without locking everyone else out. Even adding a single text field to a small database with minimal relationships causes a huge lock-up. This is especially problematic when I'm making a change that should take 5-10 minutes to execute. Sure, I could wait and make that change after hours, but what if the first opportunity for that is four days from now? It just seems silly that, in 2012, you can end up in a situation where you can't execute a simple solution on the fly. If it requires adding a field, though, 5-10 minutes can easily become 30-45 minutes or more, not just for me, the developer, but for any of the users accessing the database (the database being changed, or ANY database being served).


The bigger danger is the risk of file corruption. If you lose your network connection, or your computer hangs and you have to force quit while in Layout mode, my understanding is that this can corrupt your file.

If your file did become corrupt, you would have to go back to an uncorrupted backup, empty its data, and import all your data from the corrupt file into the good backup. This could result in some downtime that your company probably cannot afford.

Data separation allows you to make changes to your interface file and, when you are ready, just upload it over the original on the server. Quick and easy, since the interface file does not contain data and therefore remains small.
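That "upload it over your original" step is worth scripting so you always keep a rollback copy. A hedged sketch in Python (file names and paths are placeholders, and it assumes the interface file has already been closed on the server):

```python
import shutil
from datetime import datetime
from pathlib import Path

def deploy_interface(new_ui: Path, served_ui: Path) -> Path:
    """Back up the currently served interface file with a timestamp,
    then replace it with the new build. Returns the backup path so
    you can roll back if the new UI misbehaves."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    backup = served_ui.with_name(f"{served_ui.stem}-{stamp}{served_ui.suffix}")
    shutil.copy2(served_ui, backup)   # keep a rollback copy
    shutil.copy2(new_ui, served_ui)   # swap in the new interface file
    return backup
```

Because the interface file holds no data, the whole swap takes seconds, which is the payoff of separation in the first place.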

That being said, making the transition to data separation is no small task, especially in a large existing system.

As far as your concerns about how you are currently working go: I can relate and understand, but no matter how many ways you restate the problem, the reality is it's not going to change. Applying changes on the fly in that way will always present those issues. The bigger the solution and the more active users, the worse it gets. :therethere:


  • 3 months later...

I forgot to post an update, but we were able to find a solution: a bigger, badder server.

We found that a FileMaker server really relies on two things above all: RAM and hard disk speed. We installed a new server with 15,000 RPM hard drives (four mirrored drives, with the OS and databases running on C: and backing up to E:, with mirrors of each) and 24 GB of memory. Almost instantly, our problems were solved. I regularly add convoluted fields and the change processes quickly, without anyone noticing.

An additional observation: running backup software other than what's built in to FileMaker can be crippling. Not only do we back up to a different disk drive than the one our databases run from, but we wrote a batch script that moves those backups over the network to an external hard drive, and our CrashPlan backup runs from that. So our backup software doesn't interact with the server at all. We found that whenever it was running, even if it wasn't actively backing anything up, it caused significant speed issues -- and eventually crashes.
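That move-the-backups-off-the-server batch can be as simple as something like the following sketch (the paths, the file pattern, and the assumption that it runs only after FileMaker's own backup schedule has finished are all mine):

```python
import shutil
from pathlib import Path

def offload_backups(backup_dir: Path, external_dir: Path, pattern: str = "*.fmp12"):
    """Move finished FileMaker backup files off the server volume so that
    third-party backup software never touches the live databases."""
    external_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(backup_dir.glob(pattern)):
        dest = external_dir / f.name
        shutil.move(str(f), str(dest))   # works across volumes
        moved.append(dest)
    return moved
```

The key design point is the one made above: the third-party backup tool (CrashPlan or anything else) only ever sees the external drive, never the server's own disks.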


Thanks for posting. :D

FMS is also sensitive to network speed, but hard disk speed (and throughput from disk to CPU) is most certainly a huge factor in its performance.


  • 6 months later...

To throw in my $.02: developing on a live server works, but only up to a certain point. When the company I work for had ~10-15 concurrent FMP users, I started to notice some severe issues with re-indexing fields, people being unable to write records while this occurred, etc. Around that time I moved from a flat, single-file model to a separation model. This worked great as far as making changes to layouts and ERD organization was concerned, but it still didn't rectify the issues with making changes to stored calcs and adding new indexed fields.

 

Now that our company has passed 50 FMP users, developing on a live file is a huge no-no, and I have moved to completely separate Live and Development systems. Small changes to the database (layout or scripting changes) can be made on the live file and then repeated on the dev file, but any changes to the schema are made in development, tested thoroughly, and then the data is imported into cloned copies of the development database. The import can take quite a while (the last one took in excess of 3 hours), so even this method is starting to show severe drawbacks. There are tools out there to help with the import (I originally had a custom-made system from The Support Group) but I have recently moved to Goya's RefreshFM product. I would estimate in excess of 90% of my data remains unchanged between updates, and it works well enough for my needs, but I really wish there were a reliable way to "sync" a development database and a live database with FileMaker instead of having to do cloned imports.
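One small thing that takes the edge off cloned imports is verifying per-table record counts before go-live. A quick sketch of that check (the counts shown are invented; how you collect them, e.g. Get(TotalRecordCount) per table, is up to you):

```python
def verify_import(live_counts, imported_counts):
    """Compare per-table record counts between the live file and the
    freshly imported clone; any mismatch flags a table to re-check
    before putting the clone into service."""
    mismatches = {}
    for table, live_n in live_counts.items():
        got = imported_counts.get(table, 0)
        if got != live_n:
            mismatches[table] = (live_n, got)
    return mismatches

live = {"Contacts": 120_000, "Invoices": 450_000}
clone = {"Contacts": 120_000, "Invoices": 449_998}
print(verify_import(live, clone))   # {'Invoices': (450000, 449998)}
```

It won't catch subtler import problems (dropped repetitions, container issues), but it catches the most common one, a partially completed import, in seconds.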


You could also "modularize" the solution and split the data file into several data files based on function. So you'd have, for example, a contact management file, an inventory file, and an invoices file, each with whatever tables you need.

 

That may cut down on your importing.

 

But if you're bringing down the database for 3 hours anyway, why not close the file to outside users and make your changes after sandboxing them on the development server?


  • 2 years later...

This discussion answered a lot of questions I've been pondering as our in-house FM solution grows. Wondering if there have been any breakthroughs in the development process since the last post to this thread (2012)?


  • 2 weeks later...


It is highly unlikely that there will be a "breakthrough" until some serious schema changes occur (which I do not expect soon ;o).


What kind of breakthrough are you looking for? What would you want to do that these practices prevent you from doing?

 

In another thread it was clear that you did not want to work weekends or nights, and that there was no agreed maintenance window to take the solution down. That's the first thing to get in place; it will make your life a lot easier. And if your solutions power critical business processes, there is no getting around working late at night and on weekends. That's the nature of the game.

