Post timestamps: Apr 2, 2013 08:04 PM; Sep 20, 2013 10:09 AM; Sep 20, 2013 10:09 AM
gamozo has contributed 4 posts out of 21,161 total (0.02%) in 3,983 days (0.00 posts per day).
20 most recent posts:
Just tick-level on pretty much all the top 20 most traded futures. I think @ES# has about 55 million datapoints at tick level.
I pull at tick level, as I post-process all the data and frequently use tick data for simulated fills rather than n-length bars.
It's not a huge issue, I usually just pull once and then have the files around forever. I'm just requesting the feature if it's easy to add as it would be nice here and there.
Yep, that's what I currently do. However, that only works with multiple symbols; it would be nice to have one symbol at full speed. The only way I can think of doing this is to know how many ticks there are historically: with that count I could start multiple queries at different starting points. I guess I could do it in chunks of 100k trades or so and just guess for now, but it would be nice to know how many trades to expect.
Is there any chance we could get multi-threaded decompression in IQConnect? I'm bottlenecking on CPU rather than download bandwidth.
Currently I just pull all my symbols at once, which means I bottleneck on bandwidth, which makes me happy. However, I would like to be able to pull a single symbol at full network bandwidth.
Another thing that would allow me to implement this on my end is a 'get number of ticks' request, so I could query how many ticks of history there are for a symbol. Once I have that number, I could start threads at offsets of trades/threads and do a multi-threaded pull.
Any chance this could happen soon?
Edited by gamozo on Sep 19, 2013 at 05:22 PM
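For what it's worth, the offset-based pull described above can be sketched generically. `fetch_ticks` below is a hypothetical stand-in for a lookup request that accepts a tick offset (no such call exists in the API today, which is the point of the request); the splitting and stitching logic is the part being illustrated:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_ticks(symbol, offset, count):
    # Hypothetical stand-in for a history request with offset support:
    # returns `count` ticks of `symbol`, starting `offset` ticks back.
    # Here it just fabricates dummy (symbol, index) tuples.
    return [(symbol, offset + i) for i in range(count)]

def parallel_pull(symbol, total_ticks, threads=4):
    # Split the known total into one contiguous chunk per thread,
    # fetch the chunks concurrently, and stitch them back in order.
    chunk = -(-total_ticks // threads)  # ceiling division
    ranges = [(i * chunk, min(chunk, total_ticks - i * chunk))
              for i in range(threads) if i * chunk < total_ticks]
    with ThreadPoolExecutor(max_workers=threads) as pool:
        parts = pool.map(lambda r: fetch_ticks(symbol, *r), ranges)
    return [t for part in parts for t in part]

ticks = parallel_pull("@ES#", 10, threads=3)
```

Because `ThreadPoolExecutor.map` yields results in submission order, the stitched list comes back in the same order a single sequential pull would produce.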
I really do not like working with text-based protocols, as they're clumsy, expensive to parse, and take longer to write parsers for.
I'm wondering if there could be a mode added where data will be sent in binary fixed-width structures rather than CSVs. It pains my love for optimization to know that you guys take in binary data, convert it to CSV, and I convert it right back the second I read it.
I know it's probably something that most people do not ask for, but I can't imagine it would take too long for you guys to implement.
Edited by gamozo on Apr 2, 2013 at 08:20 PM
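To illustrate the CSV-versus-binary asymmetry, here is a toy round-trip in Python. The record layout is invented for the example and is not IQFeed's internal or wire format: the binary side is one fixed-width `struct` unpack, while the CSV side costs a decode, a split, and a per-field string-to-number parse:

```python
import struct

# Hypothetical fixed-width trade record (NOT IQFeed's actual format):
# timestamp_us (uint64), price (float64), size (uint32), little-endian.
TRADE = struct.Struct("<QdI")  # 20 bytes total

def encode_csv(ts, price, size):
    return f"{ts},{price},{size}\r\n".encode()

def decode_csv(line):
    # Decode, strip the terminator, split, then parse each field.
    ts, price, size = line.decode().strip().split(",")
    return int(ts), float(price), int(size)

def encode_bin(ts, price, size):
    return TRADE.pack(ts, price, size)

def decode_bin(buf):
    # One fixed-width unpack; no per-field string parsing.
    return TRADE.unpack(buf)

rec = (1364947200000000, 1553.25, 7)
assert decode_csv(encode_csv(*rec)) == rec
assert decode_bin(encode_bin(*rec)) == rec
```

A fixed-width layout also makes record framing trivial (every record is `TRADE.size` bytes), whereas CSV requires scanning for a line terminator before parsing can even begin.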