
How to push real-time updates to an RSS feed from FileMaker?


angelleye

I'm trying to develop some web applications using FileMaker as the back-end. I've tried both ODBC and the CWP XML API that FileMaker provides; however, both of them are too CPU intensive. A single query chews up 25% to 35% of the CPU for about a second, making it unusable on a live web site.

As such, I'm trying to come up with a different solution that will let me get real-time data from FileMaker onto the product pages without having to query FileMaker Server / the Web Publishing Engine on every single request.

The only thing I've been able to come up with is to generate an RSS feed (or really any static XML file) that would act as my data source. I could simply parse it the same way I parse the CWP responses.

The problem is that this file would not be real-time unless I can somehow push updates to it as they happen. That way the file would simply be updated in place and stay in sync with FileMaker, instead of needing a full export from FileMaker each time.

I use a BitTorrent client/server, Azureus, that seems to do exactly what I'm trying to do. When you create a new share in Azureus and you have web access turned on, it creates an RSS feed for your server. Any time you add a new share to your server, the RSS feed is updated automatically and is immediately available; if you remove the share, it is immediately removed from the RSS feed.

Any information on whether this plan sounds feasible, and any guides on how to accomplish the "push" update of the XML file from FileMaker, would be greatly appreciated. I can handle creating the RSS feed itself; I just don't know how to push changes to it.

Thanks!!!

If you're talking about exporting static RSS data files to the web folder of your server, you can do that with FileMaker's XML Export and an XSL stylesheet; RSS is a pretty simple format. But I don't know how to get FileMaker Server itself to do this to its local or remote "web" folder (wherever that may be), since Export is a client-machine operation. I imagine it can be done with custom web publishing and PHP.

An alternative "for dummies" such as myself would be to export to local file(s), then upload those to the web site via FTP. I could do this on a Mac using the built-in AppleScript or command-line tools. I'm certain the same thing could be done on Windows (via "wget" or a FileMaker plug-in; there are several).

This is obviously a fairly static way of updating the web data, but it could be scheduled. Here's an example of what the RSS XSL would look like for a local export using the FMPXMLRESULT grammar; a CWP export would possibly use a different grammar:

<?xml version='1.0' encoding='utf-8'?>
<!-- Example only: the channel title/link/description and the column positions
     (COL[1] = item title, COL[2] = link, COL[3] = description) are placeholders;
     adjust them to match the field order of your own export. -->
<xsl:stylesheet version="1.0"
	xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
	xmlns:fmp="http://www.filemaker.com/fmpxmlresult" exclude-result-prefixes="fmp">

	<xsl:output method="xml" encoding="utf-8" indent="yes"/>

	<xsl:template match="/fmp:FMPXMLRESULT">
		<rss version="2.0">
			<channel>
				<title>Product Feed</title>
				<link>http://www.example.com/</link>
				<description>Product data exported from FileMaker</description>
				<language>en-us</language>

				<xsl:for-each select="fmp:RESULTSET/fmp:ROW">
					<item>
						<title><xsl:value-of select="fmp:COL[1]/fmp:DATA"/></title>
						<link><xsl:value-of select="fmp:COL[2]/fmp:DATA"/></link>
						<description><xsl:value-of select="fmp:COL[3]/fmp:DATA"/></description>
						<guid isPermaLink="false"><xsl:value-of select="@RECORDID"/></guid>
					</item>
				</xsl:for-each>
			</channel>
		</rss>
	</xsl:template>

</xsl:stylesheet>

Actually, I'll be generating the RSS feed (or probably just a static XML file) using ASP, so I'll be able to put the result file anywhere I please without having to worry about uploading or anything like that. That's not the issue.

I was hoping to find a way to send UPDATES only to my XML file, instead of having to re-generate the entire thing and replace the current one with the new version.

Ideally, I'd like to figure out a way for FileMaker Server to automatically run an update script any time the data in that table changes.

AMD Opteron 254 at 2.81 GHz with 2 GB RAM. I saw somebody else mention they're running dual Xeons and 4 GB RAM, and they have the same problems. Our hard disk is a 15,000 rpm SCSI drive.

Typical queries return anywhere from 20 to 100 records or so. They come back nice and quick, but each query makes the CPU spike, so with only three of us playing around on it the CPU sat steady around 85%. That won't work.

Any luck on this? I was looking to start pulling RSS feeds of daily events per user. Each user would have fewer than 50 events per day, but I would like each user to have their own feed, plus a daily feed that encompasses all users. I was going to start jumping into this today, but I ran across your post and am concerned about the performance issues you're having. If there is anything I can do to help, let me know.

No, I haven't had much luck getting this figured out. The CWP API is simply too CPU intensive to work in a live web environment. Where I stand now is that I've got a script that generates a static XML file containing all of our product data. I'm then using this XML dataset as my primary source instead of hitting FileMaker's Web Publishing Engine every time, which seems to be the cause of the CPU spikes.

I've automated the generation of a new static XML file that replaces the existing one each morning, so it won't be real-time. Rather upsetting, but it's the best I can do right now.

If you can give me any tips on sending real-time updates to my XML instead of re-generating the entire thing, that'd be cool.

Hmm. I was the one originally complaining about performance issues (since FMSA 7), but my reasons were somewhat different (e.g. performance problems with large answer sets of more than 50-100 records).

RSS is pull technology, not push. As long as we're talking about HTTP transport, it is the client that requests data, not the server that pushes it. This means that the server only has to know the present state of the data.

Also, there is no "real-time" with HTTP, assuming you have accesses from an outside network that you don't control.

We have about 300K web requests per week; 20% of these are requests to FMI. The CPU load is reasonable, as you can see from our screenshots (requests, CPU load) from a normal work day.

There are also RSS feeds among these requests (actually, about 3000 RSS queries/week), and users can choose among 2000 RSS feeds that we offer through FMI.

To keep the bandwidth low, the trick is essentially to first query the database to see whether the data has been updated (this can be done cheaply by retrieving only a small set of data, e.g. with a small -max value, to keep server load low), and if not, to send back an HTTP 304 status code ("Not Modified") and the Last-Modified date with fmxslt:set_header(). That response is just 38 bytes. You also need to read the If-Modified-Since value sent by the newsreader.
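
Here is a minimal sketch of what that could look like in a CWP XSLT stylesheet. The fmxslt extension namespace URI, the assumption that the small query returns the newest modification timestamp in its first column, and the idea of simply comparing that string to the If-Modified-Since value are my own choices; check the function names and namespace against the FileMaker Server Advanced Custom Web Publishing documentation before relying on them.

<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch only: the fmxslt namespace URI and the column position
     (COL[1] = newest modification timestamp) are assumptions. -->
<xsl:stylesheet version="1.0"
	xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
	xmlns:fmp="http://www.filemaker.com/fmpxmlresult"
	xmlns:fmxslt="xalan://com.fmi.xslt.ExtensionFunctions"
	exclude-result-prefixes="fmp fmxslt">

	<xsl:template match="/fmp:FMPXMLRESULT">
		<!-- Timestamp of the newest change, returned by the small, cheap query -->
		<xsl:variable name="last-modified"
			select="fmp:RESULTSET/fmp:ROW[1]/fmp:COL[1]/fmp:DATA"/>
		<!-- Timestamp the newsreader sends back from its previous fetch -->
		<xsl:variable name="if-modified-since"
			select="fmxslt:get_header('If-Modified-Since')"/>

		<xsl:choose>
			<xsl:when test="$if-modified-since = $last-modified">
				<!-- Nothing changed: answer with a bodyless 304 -->
				<xsl:variable name="status"
					select="fmxslt:set_status_code(304)"/>
			</xsl:when>
			<xsl:otherwise>
				<!-- Data changed: send the new Last-Modified and build the full feed -->
				<xsl:variable name="header"
					select="fmxslt:set_header('Last-Modified', $last-modified)"/>
				<!-- ... run the full query and transform it into the RSS feed here ... -->
			</xsl:otherwise>
		</xsl:choose>
	</xsl:template>

</xsl:stylesheet>

The plain string comparison works as long as the feed always echoes back exactly the Last-Modified value it sent before; parsing and comparing real HTTP dates would take a bit more work.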

Performance always depends on how much data one processes. For example, our homepage has a Server-Side Include that fetches data from FMI each time it is accessed; the whole page is returned within 0.3 seconds, and only 1-4 lines of text plus link information are returned.

Maybe another option to think about would be to use AJAX and cache some data client-side. We are currently experimenting with a type-ahead prototype that fetches data quickly from the server, and it does reasonably well, although server load then stays up around 20%.

(Attached screenshots: CPU_Usage.jpg, Requests.jpg)
