
Developing while served?



Hello forum,

 

We all know we should not make schema changes to a file while Users are in the system, and that even opening a field's options can interfere with the User experience and even cause auto-enter serials to fail, new record creation to fail, etc.  We also know that if a served file crashes (or even a file opened locally), it should be replaced with a (pristine) backup rather than risk file damage.  What is not clear to me, and to other developers I've talked to, is whether it is safe to design over WAN.

 

In my case, our connection is very slow and drops quite often.  I have always preferred having the file on my desktop where I can keep an eye on it, and I will revert to a backup (run every 30 minutes) if there is even a blip.  In fact, if I leave the computer, I close the file first.  But I have also been told that working over WAN is very safe and preferred because a connection drop protects the served file.

 

If a file is served, residing in a development folder and not accessed by Users, and the connection drops while I am accessing it remotely and making schema changes, might it damage the file?  I realize working remotely is necessary at times, but my real question is ... if we have the choice, which is safest: working locally, over LAN, or over WAN?


But I have also been told that working over WAN is very safe and preferred because a connection drop protects the served file.

 

Working on a hosted file has benefits, like automated and frequent backups and the ability to accommodate multiple developers in the same solution at once.
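As a concrete example, those frequent backups can be automated outside the Admin Console too.  Here's a minimal sketch in Python around FileMaker Server's real fmsadmin CLI; the admin credentials, file name, and destination path are hypothetical placeholders, and exact flags vary by server version, so check your fmsadmin reference first:

    # A minimal sketch: trigger an on-demand backup of one hosted file via
    # FileMaker Server's fmsadmin command-line tool. The credentials, file
    # name, and destination path are hypothetical placeholders.
    import subprocess

    def backup_hosted_file(db="DevSolution", dest="filemac:/MacHD/Backups/dev/"):
        subprocess.run(
            ["fmsadmin", "backup", db, "-d", dest,
             "-u", "admin", "-p", "secret"],   # use a real credential store
            check=True,
        )

    if __name__ == "__main__":
        backup_hosted_file()   # e.g. run from cron every 30 minutes

Something like that run on a schedule gives you the "backup every 30 minutes" safety net without touching the served file yourself.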

 

But WAN is different from LAN.  I would not consider WAN development safe.  LAN can be, with a nice robust connection.  You indicated that your WAN connection is not robust, so that pretty much rules it out.

Unless you can remote into a desktop on the remote network and do the development there.

 

Developing on a local file is considered much safer, but it carries its own risk: an FMP crash can kill a local file, whereas an FMP crash while the file is hosted is gentler on it (depending on what you were doing at the time).


Hey Wim!

 

When you say ...

 

Developing on a local file is considered much safer, but it carries its own risk: an FMP crash can kill a local file, whereas an FMP crash while the file is hosted is gentler on it (depending on what you were doing at the time).

 

... how would we know for sure if 'what we were doing at the time' would be gentler?  IOW, if it goes down at all, isn't it safer to assume it is toast, since we cannot be guaranteed it is fine?  An FMP crash while served may be gentler, but the action should be the same ... replace the file.  Correct?

 

Thank you so much for adding depth to our understanding of the issues involved.  My only concern about working on remote files is that I've worked over WAN with the files at a hosting service.  I lost the connection and found out that the hosting service had just restarted the file.  YIKES.  If the file is in my hands, I KNOW exactly what is happening to it at all times; if it is hosted elsewhere, I do not always know.

 

Much appreciated! 


 

 

... how would we know for sure if 'what we were doing at the time' would be gentler? 

 

If you were committing schema changes, then it would not be safe.  If you just had the file open but were not doing anything, or were just running a script, then an FMP crash on a hosted file would not toast the file.  A local file would be toast in the same scenario.

 

But when in doubt, always err on the side of caution, of course.  There are no absolutes here, so the extra 10 minutes to restore from a backup is always the safest thing to do.


4 weeks later...

I try to avoid hijacking threads whenever possible, but I believe my question relates well to this discussion.  If not, just say the word and I'll start a new thread.

 

For years now, I have been doing all my updates, changes, improvements, and fixes on live, server-based files.  We have a fairly small user base (18 total, only 10 of them active), and I have yet to experience any problems with this approach.  However, over the last 2 years I have begun to greatly expand the functionality of our system, have added FMGo users, and we are likely going to add a number of desktop users in the near future.  We also plan to create a citizen portal via CWP/PHP.

 

I am getting more and more uncomfortable with working on a live system, and I'm trying to find the correct way to handle a constantly changing system. 

 

Our db system is in constant use 7 to 5, 5 days a week, and I do my FM work during the same hours.  I'd really like to NOT work nights and weekends, and taking portions of the system offline for any length of time is not a great option.  It seems I may be stuck between a rock and a hard place.

 

How do you work with a FileMaker system that is in constant use, but is still in need of updates, improvements, and tweaks?

 

 

 

 


 

 

How do you work with a FileMaker system that is in constant use, but is still in need of updates, improvements, and tweaks?

 

To be blunt: you don't.

 

Any system must have a maintenance window agreed upon with the business users.  That's when you kick everyone off and apply the changes, or replace the file with a new copy and import the data.

For everyone in IT, that often means nights and weekends.  That's just the nature of the game.
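If it helps, the "kick everyone off" part can be scripted.  A rough Python outline around the fmsadmin CLI; the command names (disconnect, close, open) are real, but the credentials and file name below are placeholders and flags vary by server version:

    # Sketch of a maintenance-window helper: disconnect clients, close the
    # file for changes, then open it again afterwards. The admin credentials
    # and database name are hypothetical.
    import subprocess

    def fms(*args):
        subprocess.run(["fmsadmin", *args, "-u", "admin", "-p", "secret", "-y"],
                       check=True)

    def start_maintenance(db="Production"):
        fms("disconnect", "client")   # disconnect all connected clients
        fms("close", db)              # take the file offline for changes

    def end_maintenance(db="Production"):
        fms("open", db)               # serve the updated file again

The point is that opening and closing the window becomes a repeatable, documented step rather than an ad hoc scramble.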


As an end user, that's my experience; I'll be told either:

  1. on XXX date the IT service will be offline for updates, or
  2. updates were installed overnight.

It's not always ideal, but I understand that needs must...


You design on a development system or standalone and script a way to migrate, which is something you should have in place anyway in case of corruption.  Then, when the time comes, you close the files, migrate locally so it is fast, and re-serve.  It happens all the time.  Even major banks have regular downtime; such is life.

 

The real key is designing in another file (or, hopefully, you are using the separation model) and then you switch the UI, modify a few fields in the data file if needed, and serve it back to Users.  It CAN be done - it must be planned, mind you ... but it works very well and can happen quickly and safely if the plan is coordinated and documented. :-)
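To make the swap itself concrete, here is a rough sketch of the "close, replace the UI file, re-serve" step under the separation model, assuming FileMaker Server's fmsadmin CLI; the paths, file names, and credentials are hypothetical placeholders:

    # Outline of swapping in a freshly developed UI file while the data file
    # stays put. Paths, file names, and credentials are placeholders.
    import shutil, subprocess

    def fms(*args):
        subprocess.run(["fmsadmin", *args, "-u", "admin", "-p", "secret", "-y"],
                       check=True)

    def swap_ui_file():
        fms("close", "Solution_UI")               # close only the UI file
        shutil.copy("Dev/Solution_UI.fmp12",      # drop in the dev copy
                    "/Library/FileMaker Server/Data/Databases/Solution_UI.fmp12")
        fms("open", "Solution_UI")                # Users reconnect to the new UI

Because the data file never moves, the actual downtime is just the few minutes the UI file is closed.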


You design on a development system or standalone and script a way to migrate, which is something you should have in place anyway in case of corruption.  Then, when the time comes, you close the files, migrate locally so it is fast, and re-serve.  It happens all the time.  Even major banks have regular downtime; such is life.

 

The real key is designing in another file (or, hopefully, you are using the separation model) and then you switch the UI, modify a few fields in the data file if needed, and serve it back to Users.  It CAN be done - it must be planned, mind you ... but it works very well and can happen quickly and safely if the plan is coordinated and documented. :-)

 

Some excellent input in all of the above, and I'm learning very quickly how I need to change my ways.  And learn.  A lot.

 

When you say migrate, are you referring to bringing the data into your updated system, or migrating the changes into the existing system?


When you say migrate, are you referring to bringing the data into your updated system, or migrating the changes into the existing system?

 

 

Hi Darren,

 

Migration involves data.  Even if you use separation of UI and data (two files), you should still have a scripted process for disaster recovery where data from all tables can be exported and imported into a new, empty clone.  If you experience a crash, export as Merge (MER) or CSV, which will clean the data.  If you establish this export/import script, be sure to update the maps whenever you add tables or fields.  Then, if your file crashes, you will be able to quickly migrate the data from the old file to the new one.
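To picture the "maps" part, here is a tiny illustrative sketch in Python, just to show the shape of it; the real thing would be a FileMaker script built on Export Records / Import Records, and every table, field, and helper function below is hypothetical:

    # The essential shape of a scripted migration: one map entry per table
    # listing the fields to carry across. This map must be updated every
    # time you add a table or field. export_table/import_table stand in
    # for the real Export Records / Import Records script steps.
    MIGRATION_MAP = {
        "Customers": ["id", "name", "email"],          # hypothetical schema
        "Invoices":  ["id", "customer_id", "total"],
    }

    def migrate(export_table, import_table):
        for table, fields in MIGRATION_MAP.items():
            rows = export_table(table, fields)   # e.g. to Merge/CSV, cleaning the data
            import_table(table, fields, rows)    # into the matching table of an empty clone

The map is the piece that silently goes stale, which is why the reminder to maintain it matters.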

 

But if you use separation and you are simply moving the solution to a new UI, then no data migration is needed because you can use the same served data file (closing it first, of course).  Sometimes you may need to make a few changes to the fields or add an occasional field but, if designed well, this is *minor.

 

* It is important to make the changes to your existing data file (if any) in the same order that you made them in YOUR development copy of the data file.
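One simple way to keep that order straight is a running change log you replay verbatim against the served data file during the window; the entries here are made-up examples:

    # Replay development-side data-file changes in the exact order they were
    # made. Entries are hypothetical; in practice this is a checklist you
    # work through by hand during the maintenance window.
    DATA_FILE_CHANGES = [
        "add field Invoices::discount (Number, auto-enter 0)",
        "add table PaymentLog",
        "add field PaymentLog::invoice_id (Number, indexed)",
    ]
    for step in DATA_FILE_CHANGES:
        print("apply next:", step)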

