then old files from a previous build can end up in the zip file of a new
build that no longer contains them. When the zip command sees an existing
archive, it updates the archive in place rather than rebuilding it from
scratch.
"These five variables are described in <versioninfo>" because the reference was
to a list item, not to its enclosing section. I changed it to read 'These
five variables are the same as those described under <versioninfo> in the
section called "match Directive"'.
chars, bytes with value above 0x7F were being sign-extended within a three-byte
buffer to become FFF. This made output like
[0000] 16 03 00 00 53 01 00 00 4F 03 00 3F 47 FFFFFFFFF ....S... O..?G...
[0010] 2C FFFFFFFFF60 7E FFF00 FFFFFF7B FFFFFFFFFFFF77 ,...`~.. ..{....w
[0020] FFFFFFFFFFFF3C 3D FFF6F FFF10 6E 00 00 28 00 16 ....<=.o ..n..(..
Fixed, the output looks like
[0000] 16 03 00 00 53 01 00 00 4F 03 00 3F 47 D7 F7 BA ....S... O..?G...
[0010] 2C EE EA B2 60 7E F3 00 FD 82 7B B9 D5 96 C8 77 ,...`~.. ..{....w
[0020] 9B E6 C4 DB 3C 3D DB 6F EF 10 6E 00 00 28 00 16 ....<=.o ..n..(..
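The fix amounts to casting through unsigned char before formatting, so the byte keeps its value 0..255 instead of being promoted to a negative int. A minimal sketch of the two behaviors (function names are hypothetical, not the actual code):

```c
#include <stdio.h>
#include <string.h>
#include <assert.h>

/* On platforms where char is signed, a byte like 0xD7 promotes to the
   negative int 0xFFFFFFD7 when passed to snprintf; truncated into a
   small buffer, "FFFFFFD7" comes out as "FFF". */
void format_byte_buggy(char c, char *out, size_t outlen) {
    snprintf(out, outlen, "%X", c);
}

/* Casting to unsigned char prevents the sign extension. */
void format_byte_fixed(char c, char *out, size_t outlen) {
    snprintf(out, outlen, "%02X", (unsigned char) c);
}
```

With a three-character buffer, the buggy version turns every high byte into the same "FFF" seen in the broken dump above, while the fixed version prints the real value.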
were intended to be. We are okay to print if
1. We don't have a completion time estimate yet; or
2. We have passed the last completion time estimate; or
3. The estimated time remaining differs from the last one printed by more
than 3 minutes, and the difference accounts for more than 5% of the
estimated total time.
The problem was that the last printed time remaining was calculated not as
difftime(last_est.tv_sec, last_print.tv_sec), but as
difftime(last_est.tv_sec, now->tv_sec). In other words it was constantly
changing, and at the same rate as the estimated time left (if the scan was
progressing at a constant rate). That means that as soon as a completion time
estimate was fairly accurate, you would not get any more estimates because the
difference in the two times would always be small.
The test was (last_print.tv_sec < 0), which is never true. I changed it to
last_print.tv_sec == 0, which checks if the last_print structure has been given
a value yet. This little bug appears not to have mattered much, because in the
else branch of the if, other calculations with an uninitialized last_est struct
seem to have resulted in a true value anyway.
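The corrected decision can be sketched as follows. The struct layout, field names, and the simplified form of condition 2 are assumptions for illustration, not the actual scan-progress code; the essential point is that the new estimate is compared against the last *printed* estimate, and that an unset timestamp is detected with `== 0` rather than `< 0`:

```c
#include <time.h>
#include <math.h>
#include <stdbool.h>
#include <assert.h>

struct estimate {
    time_t print_time;      /* when we last printed an estimate (0 = never) */
    double last_remaining;  /* seconds remaining that we last printed */
};

bool ok_to_print(const struct estimate *e, double remaining, double total) {
    /* 1. No completion time estimate has been printed yet. */
    if (e->print_time == 0)
        return true;
    /* 2. We have passed the last estimated completion time
       (simplified here as the printed remainder having run out). */
    if (e->last_remaining <= 0)
        return true;
    /* 3. The estimate moved by more than 3 minutes AND by more than
       5% of the estimated total time, relative to what was printed. */
    double diff = fabs(remaining - e->last_remaining);
    return diff > 3 * 60 && diff > 0.05 * total;
}
```

Because `diff` is taken against the last printed value rather than recomputed from the current time, a stable estimate no longer suppresses all future updates.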
avoids an infinite recursion bug present in the old decoder. I raised the
number of compression pointers that dns.decStr will follow from 1 to 3 because
I found a server that sent 2.
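A bounded pointer-following loop of this kind can be sketched in C. This is an illustrative decoder, not dns.decStr itself; MAX_PTRS plays the role of the raised limit, and the error handling is deliberately minimal:

```c
#include <stddef.h>
#include <string.h>
#include <assert.h>

#define MAX_PTRS 3  /* compression pointers we are willing to follow */

/* Decode a DNS name from msg (msglen bytes) starting at offset off
   into out. Returns the length written, or -1 on error. Bounding the
   number of pointers followed prevents infinite recursion on
   malicious or looping messages. */
int decode_name(const unsigned char *msg, size_t msglen, size_t off,
                char *out, size_t outlen) {
    size_t o = 0;
    int ptrs = 0;
    while (off < msglen) {
        unsigned char len = msg[off];
        if (len == 0)                    /* root label: done */
            break;
        if ((len & 0xC0) == 0xC0) {      /* compression pointer */
            if (++ptrs > MAX_PTRS || off + 1 >= msglen)
                return -1;               /* loop or truncated message */
            off = (size_t) (((len & 0x3F) << 8) | msg[off + 1]);
            continue;
        }
        if (off + 1 + len > msglen || o + len + 1 >= outlen)
            return -1;
        memcpy(out + o, msg + off + 1, len);
        o += len;
        out[o++] = '.';
        off += 1 + len;
    }
    out[o] = '\0';
    return (int) o;
}
```

A message whose pointer refers back to itself is rejected after MAX_PTRS hops instead of looping forever.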
it started by allocating six times the size of the input string because in the
worst case each byte can take up to six bytes when escaped (&#xXX;). It was
wasteful of time because it built the string up with strncat, which must
scan the entire destination string to find its terminating null byte every
time it is called. This led to quadratic time complexity, not linear as
expected.
The new version uses the usual strategy of doubling the size of the buffer
whenever it runs out of space. It builds up the string using memcpy, checking
each time that there is space for the new copy.
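The doubling-plus-memcpy strategy can be sketched with a small growable buffer. This is a hypothetical helper showing the technique, not the actual escaping code; growing the capacity geometrically keeps the total copying work linear in the output size:

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

struct strbuf {
    char *data;
    size_t len, cap;
};

/* Append n bytes of s to sb, doubling the buffer when it runs out of
   space. Unlike strncat, memcpy at a tracked offset never rescans the
   destination, so repeated appends stay O(total length). */
int strbuf_append(struct strbuf *sb, const char *s, size_t n) {
    if (sb->len + n + 1 > sb->cap) {
        size_t newcap = sb->cap ? sb->cap : 16;
        while (sb->len + n + 1 > newcap)
            newcap *= 2;
        char *p = realloc(sb->data, newcap);
        if (p == NULL)
            return -1;
        sb->data = p;
        sb->cap = newcap;
    }
    memcpy(sb->data + sb->len, s, n);
    sb->len += n;
    sb->data[sb->len] = '\0';
    return 0;
}
```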
domain names were changed to '.', probably as a result of some code that wasn't
updated when surrounding code was. This changed the name net360.example.com to
net36..example.com.
garbage output and could crash Zenmap by including 0x0C bytes in XML
files. The Zenmap crash looked like
SAXParseException: .../zenmap-XXXXXX.xml:39:290: not well-formed (invalid token)
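For background, XML 1.0 permits only tab, newline, and carriage return among the ASCII control characters, which is why a stray 0x0C (form feed) makes the parser reject the file. A minimal byte-level predicate for the ASCII range (an illustration, not Nmap's actual filter; bytes at 0x80 and above belong to multibyte UTF-8 sequences and need separate handling):

```c
#include <stdbool.h>
#include <assert.h>

/* True if byte c is allowed in XML 1.0 character data, checking only
   the ASCII control range. 0x09 (tab), 0x0A (LF), and 0x0D (CR) are
   the only permitted characters below 0x20. */
bool xml_allowed_ascii(unsigned char c) {
    return c == 0x09 || c == 0x0A || c == 0x0D || c >= 0x20;
}
```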