Viewing User Profile for: codingcorners
Joined: Aug 5, 2016 08:46 AM
Last Post: Aug 5, 2016 09:03 AM
Last Visit: Sep 16, 2016 07:29 PM
Website: https://www.shippifly.com
Location: United States
Occupation: Programmer/Developer
Interests: Java, PHP, Javascript
Email: codingcorners@gmail.com
Post Statistics
codingcorners has contributed to 1 post out of 21,193 total posts (0.00%) in 2,835 days (0.00 posts per day).

20 Most recent posts:

Hi all, I'm building a Java app for a client based on the same idea as the sample Level1Socket.java app in the developer examples. The gist: it's a socket listener app (command line, no GUI) that does the following (rough code sketch after the list):

1. Sets the fieldset to 'Symbol', 'Last', 'Total Volume', 'High', 'Low', 'Number of Trades Today', and 'VWAP'.

2. It then loads a CSV 'symbol loader' file with one symbol per line (about 600 in total right now, but the plan is for closer to 6,000 at expected capacity) and sends the 'w[SYMBOL]' Watch command for each of them.

3. Next it stores the returned summary lines (either P or Q) in an ArrayList. It ignores all T, F, and n lines.

4. Every three seconds, a separate Timer thread loops through this ArrayList and saves the data to a MySQL database, updating if the symbol already has a row or inserting if not.
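
For reference, here's roughly what the code looks like right now, heavily simplified: no error handling or reconnect logic, the latest line per symbol kept in a Map rather than the ArrayList, and a one-column table. The port number, the 'S,SELECT UPDATE FIELDS' command name, and the schema are just what I've pieced together from the docs, so treat those as my assumptions:

import java.io.*;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.sql.*;
import java.util.*;

public class Level1Collector {
    // latest summary/update line seen per symbol; the Timer thread drains this every 3 seconds
    static final Map<String, String> latest = Collections.synchronizedMap(new HashMap<>());

    public static void main(String[] args) throws Exception {
        Socket sock = new Socket("127.0.0.1", 5009);   // Level 1 port -- my assumption, check your install
        PrintWriter out = new PrintWriter(
                new OutputStreamWriter(sock.getOutputStream(), StandardCharsets.US_ASCII), true);
        BufferedReader in = new BufferedReader(
                new InputStreamReader(sock.getInputStream(), StandardCharsets.US_ASCII));

        // 1. Restrict updates to the fields I care about (command name as I read it in the docs)
        out.println("S,SELECT UPDATE FIELDS,Symbol,Last,Total Volume,High,Low,Number of Trades Today,VWAP");

        // 2. Watch every symbol in the CSV loader file (one symbol per line)
        for (String sym : Files.readAllLines(Paths.get("symbols.csv"))) {
            if (!sym.trim().isEmpty()) out.println("w" + sym.trim());
        }

        // 4. Every three seconds, flush whatever has been collected to MySQL
        new Timer(true).scheduleAtFixedRate(new TimerTask() {
            public void run() { saveToDb(); }
        }, 3000, 3000);

        // 3. Keep only summary (P) and update (Q) lines; ignore T, F, n, etc.
        String line;
        while ((line = in.readLine()) != null) {
            if (line.startsWith("P,") || line.startsWith("Q,")) {
                String[] parts = line.split(",", 3);
                if (parts.length > 1) latest.put(parts[1], line);   // last message per symbol wins
            }
        }
    }

    // Upsert into a one-row-per-symbol table (schema simplified to one raw column here;
    // the real table breaks the fields out into separate columns)
    static void saveToDb() {
        String sql = "INSERT INTO quotes (symbol, raw_line) VALUES (?, ?) "
                   + "ON DUPLICATE KEY UPDATE raw_line = VALUES(raw_line)";
        try (Connection c = DriverManager.getConnection(
                 "jdbc:mysql://localhost/marketdata", "user", "pass");   // MySQL Connector/J on classpath
             PreparedStatement ps = c.prepareStatement(sql)) {
            synchronized (latest) {   // manual sync required when iterating a synchronizedMap
                for (Map.Entry<String, String> entry : latest.entrySet()) {
                    ps.setString(1, entry.getKey());
                    ps.setString(2, entry.getValue());
                    ps.addBatch();
                }
            }
            ps.executeBatch();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}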


My question: there's a lot of data coming through the feed that I don't necessarily need, the 'F' Fundamental lines in particular. Is there any way to turn those off, like there is with Timestamp and News lines?
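
For context, here's what I'm doing on that front today: the timestamp/news opt-outs plus a cheap first-character check so unwanted lines are dropped before any parsing. The command names below are just how I read them in the docs, so correct me if they're off:

// Opt out of what the protocol lets me opt out of (command names are my reading of the docs)
out.println("S,TIMESTAMPSOFF");   // stop the T timestamp messages -- assumed command name
out.println("S,NEWSOFF");         // stop the news headline messages -- assumed command name

// Cheap prefix check in the reader loop so F (and anything else unwanted) is discarded immediately
String line;
while ((line = in.readLine()) != null) {
    if (!line.startsWith("P,") && !line.startsWith("Q,")) continue;   // F, T, n, etc. dropped here
    // ... parse and store the summary/update line as above ...
}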

Another question: is this the best way to get the most current data from IQFeed in real time? I was hoping there might be a way to batch-poll a group of symbols for a specific fieldset and re-poll that data every 2 seconds to look for changes, rather than waiting on the variability of a socket session to spit out data (which is giving us some inconsistent results when we go to save it to our DB).
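
To make that concrete, this is the kind of thing I was picturing -- a scheduled batch poll every 2 seconds. The "f" + symbol force-refresh request is just my guess from skimming the docs, so please treat that command as hypothetical until someone confirms it:

import java.io.PrintWriter;
import java.util.List;
import java.util.concurrent.*;

class BatchPoller {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Every 2 seconds, ask the feed to re-send a summary for each watched symbol.
    // "f" + symbol is my assumed force-refresh request -- verify it against your protocol version.
    void start(PrintWriter out, List<String> symbols) {
        scheduler.scheduleAtFixedRate(() -> {
            for (String sym : symbols) {
                out.println("f" + sym);
            }
        }, 2, 2, TimeUnit.SECONDS);
    }
}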

I'm finding that I can't get enough good data saved from the feed to do accurate processing into my database table. My guess is that the reader is dealing with so much other data that it can't keep up with what's coming down the pipeline, so I'm trying to free it up and save some bandwidth.
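
One thing I'm going to try so the reader thread never falls behind: hand each line off to a bounded queue and do all of the MySQL work on a separate writer thread, something like the sketch below (saveBatchToMySql is just a placeholder for the upsert shown earlier):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

class QuotePipeline {
    // Bounded hand-off queue: if the DB writer falls behind, offers fail instead of stalling the reader
    private final BlockingQueue<String> queue = new ArrayBlockingQueue<>(100_000);

    // Called from the socket-reading thread: cheap and non-blocking
    void onLine(String line) {
        if (line.startsWith("P,") || line.startsWith("Q,")) {
            queue.offer(line);   // returns false (drops the line) if the queue is full
        }
    }

    // Runs on its own thread: drain in batches and write to MySQL
    void startWriter() {
        Thread writer = new Thread(() -> {
            List<String> batch = new ArrayList<>();
            while (true) {
                try {
                    batch.add(queue.take());        // block until at least one line is available
                    queue.drainTo(batch, 5_000);    // then grab whatever else is already waiting
                    // saveBatchToMySql(batch);     // placeholder: the upsert from the sketch above
                    batch.clear();
                } catch (InterruptedException ex) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        }, "db-writer");
        writer.setDaemon(true);
        writer.start();
    }
}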

Any help would be greatly appreciated; I've been banging my head against the wall all night on this! :)

