About ejpvi

  • Rank: member (Members)
  • Content count: 148
  • Days won: 1 (last won the day on March 1, 2015)
  • Community reputation: 1 (Neutral)
  • Birthday: 04/17/1981
  1. Greetings FileMaker gurus. When FMP 12 was first released, I tried installing it on one of my workstations and did a database conversion from 10 to 12 "locally." At the time I was not very pleased with the performance drop I saw after switching to 12. The database appeared to run slower, and all sorts of graphical issues made my layouts look very ugly; buttons no longer showed a click state, they just jumped straight to the page. I am not sure whether performance would be better or worse once I upgraded our FMP Server 10 to Server 12, but I wasn't willing to risk it, so I decided to wait a while and see if any patches were released. I am looking for feedback from other people who may have upgraded from 10 to 12. Have any patches resolved the performance issues that were plaguing the program right at release? Do complex scripts have to be checked to make sure they still work? I have a pretty big database, with lots of tables, fields, scripts, and layouts. Is this upgrade going to be a huge undertaking to verify everything still works, or has it been a smooth transition for most? Any feedback would be appreciated, because I am hoping to perform the upgrade in a month or so, but with so many people using my system daily, I am concerned about what kind of problems I will run into. Thank you! Edward
  2. Greetings FileMaker gurus, I was wondering if anyone has set up a server-side script that will look at a network location at a scheduled time and, if a particular file exists (MyImport.xls), import it. (It is always in the same format and always named the same thing.) It would be nice if the script could verify the file is present before trying to import, and even better if it could delete the file after the script has run, so it isn't importing the same file over and over if someone forgets to put the updated one in that location. Any ideas? Thanks!
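A minimal sketch of what such a server-side schedule could look like, assuming the file is dropped into the server's Documents folder (the location a server-side script can reliably read) and that the layout name is a placeholder. Two caveats: server-side schedules in this era generally cannot import Excel files, so the drop file may need to be saved as .csv first, and FileMaker has no native delete-file script step, so cleanup would fall to a paired OS-level (batch file) schedule:

```
Set Error Capture [ On ]
Go to Layout [ "Imports" ]            # placeholder layout on the target table
Import Records [ No dialog ; "file:" & Get ( DocumentsPath ) & "MyImport.csv" ]
If [ Get ( LastError ) = 100 ]        # error 100 = file is missing
    Exit Script [ ]                   # nothing to import this run
End If
# Pair this with an OS-level script schedule that deletes the drop file
# after a successful import, so the same file is not re-imported next run.
```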
  3. Would that plugin work on FileMaker Pro 10 Advanced Server? I couldn't find any information about the server edition on the site you sent. I had considered using an auto-enter calculation, but couldn't wrap my head around it, since it would require new records to populate another table. Then again, a timestamp as one of the join criteria would always be different, which would ensure unique records, but the before-and-after part is a little tricky. Still trying to think of something; it seems like a changelog should be pretty easy. I just don't see how to make it available to all layouts without doing them one by one. Thanks,
  4. I am seeking some advice on how to tackle a project I am working on. There may be some flaws in my logic; my idea made sense at the time, but I am still missing a key piece to keep everything tied together. We have PO information kept in QuickBooks, from which I am deriving vendor status reports in FMP. The information will be updated regularly from QB, but the users need to make their own notes pertaining to each line item, and those notes must remain in the historic tables in FMP when the table of QB information is cleared and re-imported. I am hoping I can sync it back up with the notes and information in the historic tables. I envision three tables. Table 1 is information exported from QuickBooks Enterprise (same format, just updated information), imported directly into a table in FileMaker with matching field names; the data will be deleted and re-imported whenever they want refreshed info. Table 2 will hold overall PO notes. Table 3 will hold individual line-item notes and information per PO. Table 1 I was going to link by PO number to Table 2, and in Table 2 I was going to auto-serialize records as line-item notes were entered and as they were created in Table 3. It gets a little confusing because I can easily keep POs separated and in sync, but it is possible the same item number, with different information, may be present multiple times in a PO. I am trying to figure out how to keep those line items in sync with the QB data, for which, unfortunately, I lack any real identifiers, since some info will be changed in QB and I run the risk of it not matching back up with my historic data. I hope this makes sense. I was hoping that by keeping an indexed serial key in the overall-PO and individual-item tables I could keep everything separate and in sync, but that doesn't solve how my imported data will know which line item goes with which key. If the data were re-sorted before import, it would throw all of the historic info off.
Example:
    PO #1: Item 0001, Item 0002, Item 0003
    PO #2: Item 0001, Item 0001, Item 0001, Item 0002
There may be individual notes and due dates for Item 0001 in PO #2. How can I keep that data in sync with my historic info if the only identifiers I can really match up are item # and PO #? I feel I need to somehow serialize the data from QB, but since it is pulled from QB and deleted in FMP, I am not sure how to progress. Any advice would be appreciated; I may be going about this all wrong. Thanks!
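One way to sketch this, assuming the QB export keeps line items in a stable order within each PO (if it is re-sorted before export, this breaks, as noted above): build an occurrence counter per PO + item during each re-import and match on a composite key. Field and layout names here are placeholders:

```
# MatchKey (calculation) = PONumber & "|" & ItemNumber & "|" & LineSeq
# After each re-import, renumber the occurrences:
Go to Layout [ "QB_Import" ]
Go to Record/Request/Page [ First ]
Loop
    If [ PONumber & "|" & ItemNumber = $PrevKey ]
        Set Variable [ $Seq ; Value: $Seq + 1 ]
    Else
        Set Variable [ $Seq ; Value: 1 ]
    End If
    Set Field [ LineSeq ; $Seq ]
    Set Variable [ $PrevKey ; Value: PONumber & "|" & ItemNumber ]
    Go to Record/Request/Page [ Next ; Exit after last ]
End Loop
```

The historic notes table would store the same MatchKey, so the second "Item 0001" on PO #2 always relinks to the second occurrence's notes.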
  5. Greetings, this may be answered elsewhere, but I can't find a specific response to my question. I am using a "robot" client to automate a script I need to run nightly. I would have preferred to use FileMaker Server's scheduled scripts to do this, but every time I run it, it gets stuck in an infinite loop or something, and the only way to get rid of it is to reboot the server. I am performing an ODBC SQL import through the script, which should be supported; I think the problem is the third-party driver that has to be initialized. It just doesn't want to work when the scheduled script runs, yet from the client side, it works like a charm. So after a little research, I found that people use robot clients to automate scripts that use functions that won't work on the server. I have the client set up: a local FileMaker file with a script that runs on startup. That script executes the data import script from the main database, and then a command exits the application. I have heard you can use Windows Task Scheduler to open the local FileMaker file at a given time; it will go through its scripts, then exit. I am trying to decide the best place to house this robot client, and I am wondering if anyone has ever used a Windows Terminal Server. For my purposes it would probably be perfect, but given the nature of how Terminal Server works, I am wondering whether I will cause myself or any of the users problems. Does anyone here currently use Terminal Server for something like this? Will the robot client still execute if I am not currently logged in as Admin on the Terminal Server? I just want as stable a solution as I can get.
  6. Re: Need some guidance
    Within the script you write, just make sure it is pulling the data from that field. In case you have never used a script trigger, it is pretty easy: just right-click the field you want to apply it to in Layout Mode and choose Script Triggers. There are several options; I suggested OnObjectModify since it only fires when someone changes the data in that field. It is about the same as creating a button, except it executes whenever someone modifies the field. The script you would need to write would be very basic as well, probably something like:

    Freeze Window
    Set Variable [ $Temp ; Value: FieldName ]
    Go to Layout [ "ChangesTable" ]
    New Record/Request
    Set Field [ Changes ; $Temp ]
    Go to Layout [ original layout ]

    Here FieldName is the field you are monitoring, and ChangesTable is a layout based on your new table that will keep track of the changes. To start, you probably just need a Changes field and a Timestamp field. Remember the timestamp can be auto-populated as the record is created: when creating the field in Manage Database, click Options and choose the auto-enter creation timestamp. I always use a Freeze Window at the beginning so no one will notice that the script is jumping around layouts and creating records. Then either use the script trigger on that field, or create a button on that layout that points to that script. Play around with it; I typed that up pretty quickly, but hopefully it will point you in the right direction. I would suggest testing on a local copy of your database from a backup, instead of making changes in the live system, until you are certain it works the way you want. Thanks,
  7. Re: Need some guidance
    There may be better ways to do it, but perhaps you could use an "OnObjectModify" script trigger on that field. Create a new table to house the Changes and Timestamp fields. The script could create a new record in that table and populate it with the information, and the Timestamp field can auto-enter the current time whenever a record is created. You could also make a "Submit" button that executes the script, if you don't like using script triggers. I have done something similar for tracking changes to a field; a table with a record for each change, plus a timestamp, is pretty easy to report on. Is that the direction you are looking for?
  8. I am dabbling with the QODBC driver that offers ODBC support for QuickBooks Enterprise; there is also a POS version. While not perfect, it does seem to work. The driver doesn't work with the usual ESS capability that FileMaker can utilize, but you can use a direct SQL query to import records, and that is what I am going to use it for. I am going to collect daily inventory and the previous day's sales on a daily basis, and set it up to run overnight. So far, the client side works fine as long as the driver is installed. It is horribly slow, which is why I am opting to import only a small amount of data into a table. The server's scheduled scripts can't seem to handle a call to the driver and get stuck in an infinite loop, which can only be fixed with a restart of the server. So I am going to use the robot method of automating it: set up a local FMP file with a script that runs on startup and executes the script in my main database that pulls the data with the driver (it has all the functionality of a client and none of the limitations of a server-side script). You can either leave it always running on that machine with an OnTimer script, or use Windows Task Scheduler to open the FMP file at a certain time, with an "Exit Application" step at the end of your script so it closes FileMaker upon completion. I am in the middle of setting all this up; pulling the data has been easy if you are familiar with the tables and SQL commands, and a calculated text field in FMP can build your query based on your own calculations. The driver itself is inexpensive and has a 30-day trial, which I am using for testing. If you own QuickBooks Enterprise, it should come with a free copy of the QODBC driver. I am testing the POS one, which I will have to purchase; by automating this with the robot, I should only need one license.
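For what it's worth, the robot file's startup script can be as small as this sketch; the script and file names are placeholders, and the real import script lives in the hosted database:

```
# OnFirstWindowOpen script of the local robot file:
Perform Script [ "Nightly QODBC Import" from file: "Main Database" ]
Exit Application            # closes FileMaker so Task Scheduler can relaunch it daily
```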
  9. I had a similar need the other day. I had several items in my database that have the same item number but a different version from a drop-down (blue, green, etc.), so I only wanted each counted once. Reading the article below gave me the idea of how to figure it out. Unfortunately, I am not sure how you would do it without some kind of loop to check the flag. You first must sort by the field you are checking, otherwise this won't work; you can do this with any found set, or just show all records. I then used GetNthRecord() in an unstored calculated field:

    Let ( [ RecNum = Get ( RecordNumber ) ] ;
      If ( GetNthRecord ( Item # ; RecNum ) ≠ GetNthRecord ( Item # ; RecNum - 1 ) ; 1 ; 0 )
    )

    http://www.filemakerhacks.com/?p=1798 From there you can basically run a loop that counts any record that has a "1" in the unique-flag calculation. Hope that helps a little. Thanks,
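The counting loop could look like this sketch, assuming the found set is already sorted by the item field and UniqueFlag is the unstored calculation above (1 on the first occurrence, 0 otherwise):

```
Set Variable [ $Count ; Value: 0 ]
Go to Record/Request/Page [ First ]
Loop
    Set Variable [ $Count ; Value: $Count + UniqueFlag ]
    Go to Record/Request/Page [ Next ; Exit after last ]
End Loop
Show Custom Dialog [ "Unique items: " & $Count ]
```

A summary field defined as Total of UniqueFlag would give the same number without the loop.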
  10. Greetings FM colleagues, I have recently been working on grabbing data from our outlet's system (QuickBooks POS) for reporting purposes within FMP. It appears that QODBC now supports POS, so I got my hands on the demo and have had very successful results. The driver does not seem to support the typical ESS connection, but you can script an ODBC import via SQL command. I have basically written two scripts: one pulls daily inventory numbers and updates a table; the other pulls order line items for the previous day and imports them into another table. These two scripts work perfectly if run from a client machine that has the QODBC driver installed. I installed the driver and QuickBooks POS on the server that houses my FileMaker Server, then attempted to schedule these scripts to run once a day around 4 AM. So far, I don't understand what the issue is. I have gone back into the scripts to make sure they point at the correct ODBC connection, and everything appears normal. The script will start and run, and run, and run, but never actually initialize the driver (which has a status icon in the tray). Not only that, I cannot get it to terminate; the only way to halt the script is to restart the entire FileMaker Server service. I did read in a post that using the Set Error Capture step in the script could possibly alleviate the infinite hang, in case an error message is being produced. I am a little hesitant to try, since it is hard to find a time when everyone is out of the system to restart it. But even if it fixes the hang, it doesn't explain why it isn't working. I am wondering if anyone else has experienced issues with this particular driver, or with scripting an ODBC import in general. Like I said, the scripts work perfectly when executed through a client with a button, but I really wanted to automate this. Thanks!
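A defensive wrapper along these lines (the ErrorLog table and $Query are placeholders) would at least record what the driver returns instead of letting the schedule spin; whether it cures the hang with this particular driver is another question:

```
Set Error Capture [ On ]
Import Records [ No dialog ; ODBC ; DSN: "QODBC_POS" ; SQL: $Query ]
Set Variable [ $Err ; Value: Get ( LastError ) ]
If [ $Err ≠ 0 ]
    Go to Layout [ "ErrorLog" ]
    New Record/Request
    Set Field [ ErrorLog::Code ; $Err ]
    Set Field [ ErrorLog::When ; Get ( CurrentTimeStamp ) ]
    Commit Records/Requests [ No dialog ]
End If
```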
  11. Greetings, I apologize if I posted this in the wrong forum; I couldn't determine the best choice. I am wondering if anyone knows of any built-in tools that report database storage usage. Right now I have a database that is about 3 GB, and I have determined that about 1.3 GB is images that users have added. I plan to go through the images and shrink them to more "web-size" levels to bring the database size down. But I am still curious what is causing the mass inflation of my database. I do have lots of records, but I still feel some users have been rampantly adding things that don't follow my size guidelines (PDFs, Excel documents, Word files, images, etc.). Is there any way to do some reporting to find these culprits without going through the database record by record? I typically have to rely on users letting me know when something hangs on a record; I investigate and find someone has put a 50 MB PDF or something else into a container. Thanks! Any suggestions are welcome.
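One low-tech option, since Length() on a container field returns the size of the stored file in bytes: add an unstored calculation and search on it. Field names here are placeholders, and a find on an unstored calc will be slow on a large file, but it beats hunting record by record:

```
# ContainerBytes (unstored calculation, result type Number) = Length ( Attachments )
Enter Find Mode [ ]
Set Field [ ContainerBytes ; "> 5000000" ]   # anything over roughly 5 MB
Perform Find [ ]
Sort Records [ No dialog ; by ContainerBytes, descending ]
```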
  12. I know I have done this before, but cannot seem to get it to work. I have a table with related values, and I need to summarize the related values into a total in a sub-summary part. The sub-summary is sorting properly, but the summary of the values only gives me the total of the first value. I have tried calling the summary field from the related table, I have tried a calculation that calls GetSummary, and I have even tried just summing the related values; it always gives the same result. Can anyone advise a good way to summarize numbers that are in a related table? Like I said, I am using it in a sub-summary, so it would be a sub-total of whatever I am sorting on.
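The usual pattern, sketched with placeholder names: put the aggregation in a calculation field on the parent side, then summarize that calculation.

```
# Parent::RelatedTotal  (calculation, Number) = Sum ( LineItems::Amount )
# Parent::sTotal        (summary field)       = Total of RelatedTotal
```

Place sTotal in the sub-summary part; the report still has to be sorted by the break field for the part (and its total) to show.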
  13. I am curious how people deal with record locking in their scripts. Is there a way to kick someone out of a record so your script can take control of it? Or do you simply test whether the records are available, then lock them? What happens if someone enters a record while the script is running?
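There is no supported way to force another user out of a record, but a script can detect the lock and decide what to do; a sketch, with placeholder field names:

```
Set Error Capture [ On ]
Open Record/Request                    # attempts to lock the current record
If [ Get ( LastError ) = 301 ]         # 301 = record is in use by another user
    # skip it, log it, or pause and retry — the lock cannot be stolen
Else
    Set Field [ Status ; "Processed" ]
    Commit Records/Requests [ No dialog ]   # commits and releases the lock
End If
```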
  14. I did eventually get it to work. My issue was a connectivity one; I can't remember exactly what I did, but I think it had something to do with my mail settings. My email addresses were always typed in, so I haven't dealt with the calculation issue, but I have heard other people have had similar issues. You may want to verify that there are quotes around your email address. It may make no difference, but I did notice the system adds them whenever I enter mine; maybe it doesn't when a calculation is used.
  15. Re: Label Help
    Thank you for sharing that; it looks along the lines of what I need. I will see if I can adapt it to my system. Thanks!