BB is not showing all headers

Hi. I've been using BB for years. Today I noticed that not all headers are showing up. For example, on one newsgroup (alt.binaries.audiobooks) it downloaded over 200K headers, but it shows a maximum of 14280. I have a "*" (without the quotes) in the search. The Filter field has "Make Money Fast!". Under Header Filters, everything is unchecked.

Any suggestions??
 

BinaryBoy's reply to jerry #4141

Are you viewing the first parts only? To show all headers, right-click the results pane, click List Options... and click Show all parts of files. To change it back, right-click the results pane, click List Options... and click Show only first part of files.
 

TooCrooked's reply to BinaryBoy #4143

nah, I think I'm having the same issue as this guy.

When I start searching a group for headers, I click the Log tab. It says "downloading subjects..", "connected to...", "200...", "480...", and then "requesting..."

However, it (for me) is grabbing a fraction of the headers that are available. I see 800k headers, but it grabbed only 591. How do I purge (or force) BB to ensure it grabs every header when I have the starting point set to ALL?
 

BinaryBoy's reply to TooCrooked #4144

> However, it (for me) is grabbing a fraction of the headers that are available. I see 800k headers, but it grabbed only 591.

Is it saying "Requesting 800000 headers" on the Log tab, or does it say "Rcving 800000" on the Search tab? The number on the Log tab is simply the highest article number minus the lowest; there can be huge gaps within that range. The number on the Search tab is the actual number downloaded.
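
To picture that, here's a tiny illustrative sketch in Python (the article numbers are made up, not from your server):

# Why "requested" can far exceed "received": the server only reports the
# lowest and highest article numbers it holds, and expired or cancelled
# posts leave gaps between them.
articles = [438964, 438970, 500123, 782334]   # articles actually on the server
requested = max(articles) - min(articles)     # the range the Log tab reports
received = len(articles)                      # what the Search tab counts
print(f"Requesting {requested} headers, received {received}")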

Please go to the cache folder (you can find its location in the registry at HKEY_CURRENT_USER\Software\Hochsw\BinaryBoy\1.00\CacheFolderT) and let me know how large the server!!alt.binaries.audiobooks.txt file is. A header in that group is about 50 to 100 bytes.
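
If it's easier, here's a rough Python sketch (Windows only) that reads that registry value and brackets the header count from the file size, using the 50-to-100-bytes figure above. It assumes CacheFolderT is a value under that key, and the file name is an assumption too (the part before the !! is your server's name):

import os
import winreg

# Look up Binary Boy's cache folder from the registry key mentioned above.
key = winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Software\Hochsw\BinaryBoy\1.00")
cache_folder, _ = winreg.QueryValueEx(key, "CacheFolderT")

# Hypothetical file name -- replace "server" with your actual server name.
cache_file = os.path.join(cache_folder, "server!!alt.binaries.audiobooks.txt")
size = os.path.getsize(cache_file)

# At roughly 50 to 100 bytes per header, the size brackets the header count.
print(f"{size} bytes -> roughly {size // 100} to {size // 50} headers")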

> How do I purge (or force) BB to ensure it grabs every header when I have the starting point set to ALL?

If all the header filter options are disabled, no filters (group, group list, global in the settings, or the filter on the Search tab) are blocking the headers, you're searching for *, all search strings are enabled on the Search Strings tab in the settings, your starting point is ALL, and you're not running out of memory, it should return them all. Otherwise, either something is set wrong, there's a bug, or there just aren't that many headers.

You can reset individual newsgroups by deleting the appropriate .txt file in the cache folder and editing the binboy.bbct file, but that will only cause the headers to be downloaded again. It doesn't affect how many are searched.
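
For the delete step, a minimal sketch (the folder path and file name are assumptions; the matching entry in binboy.bbct still has to be edited by hand, since its format isn't described here):

import os

cache_folder = r"C:\BinaryBoy\Cache"  # assumed -- use the registry value above
group_file = os.path.join(cache_folder, "server!!alt.binaries.audiobooks.txt")
if os.path.exists(group_file):
    os.remove(group_file)  # Binary Boy will re-download this group's headers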
 

BinaryBoy's reply to BinaryBoy #4145

P.S. Which Usenet service are you using?
 

TooCrooked's reply to BinaryBoy #4146

Giganews.. working on generating the cache for audiobooks
 

BinaryBoy's reply to TooCrooked #4148

OK. Comcast's news servers are the same as Giganews, so I'll download the headers from that newsgroup to try to reproduce the problem.
 

BinaryBoy's reply to BinaryBoy #4149

Here's what I get from alt.binaries.audiobooks on newsgroups.comcast.net (which shows "200 News.GigaNews.Com" as the greeting).

Requesting 343371 headers (438964-782334) from alt.binaries.audiobooks

A search for * returns 343370 articles (show all) or 13084 files (show only first part). Are you seeing anything significantly different from this?

When I telnet to the server directly and enter the newsgroup, I receive the following line:

211 343371 438964 782334 alt.binaries.audiobooks

211 means there was no error. 343371 is the server's estimate of the article count (here exactly the highest article number minus the lowest, plus one). 438964 is the first article number and 782334 is the highest article number. This all seems to match what Binary Boy is showing.
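
For anyone who wants to run the same check without a telnet client, here's a minimal Python sketch of that exchange (server name taken from the post above; substitute your own, and note that port 119 is plain unencrypted NNTP):

import socket

HOST = "newsgroups.comcast.net"   # substitute your news server
GROUP = "alt.binaries.audiobooks"

with socket.create_connection((HOST, 119)) as sock:
    f = sock.makefile("rwb")
    print(f.readline().decode().strip())      # greeting, e.g. "200 News.GigaNews.Com"
    f.write(f"GROUP {GROUP}\r\n".encode())
    f.flush()
    reply = f.readline().decode().strip()     # "211 estimate lowest highest group"
    print(reply)
    _, estimate, lowest, highest, _ = reply.split()
    print(f"server estimate: {estimate}, range size: {int(highest) - int(lowest) + 1}")
    f.write(b"QUIT\r\n")
    f.flush()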
 

TooCrooked's reply to BinaryBoy #4150

I had the same request for headers, with only 1 more being reported as kept on the servers. I will see if deleting the cache helps to get files from groups that are reporting improperly. My cache is 16.1 MB.