Show HN: 4B+ DNS Records Dataset

(merklemap.com)

90 points | by Eikon a year ago ago

28 comments

  • genmud a year ago

    Neat! How is this different than domaintools/farsight [1]?

    Passive DNS [2] has been in my toolbox for 15+ years, and is invaluable for security research / threat intelligence. Knowing historical resolutions to something are so helpful in investigations.

    For anyone interested, they should check out the talk by one of the DomainTools people [3] on how it can be utilized for investigation.

    Are you passively collecting this data, or actively querying for these records?

    [1] - https://www.domaintools.com/products/threat-intelligence-fee...

    [2] - https://www.circl.lu/services/passive-dns/

    [3] - https://www.youtube.com/watch?v=oXmapqLkZd0

    • lyu07282 a year ago

      is this making use of letsencrypt as well? afaik all letsencrypt signed certificates including all subdomains are immediately public, which could be useful for security research as well

      • Eikon a year ago

        It's not about letsencrypt but certificate transparency which works the same for all public CAs.

        I wrote a documentation piece here:

        https://www.merklemap.com/documentation/how-it-works

      • whalesalad a year ago

        At first glance it looks like this data is generated via the public certificate transparency log, so I would imagine the answer is yes.

    • Eikon a year ago

      From what I understand [1] is just tlds, not subdomains?

      • genmud a year ago

        That would be incorrect, they get subdomains for passive dns feeds.

        • Eikon a year ago

          Ok, it'd be interesting to know how big is their datasets compared to mine and how much they overlap.

  • g-mork a year ago

    Any possibility of adding (first seen, last seen) time stamps? There is basically no good way to reconstruct the state of e.g. SPF at a point in time from existing DNS data sets

    • Eikon a year ago

      I could in future releases, yes.

  • romperstomper a year ago

    There are quite many duplicates, looks like for CNAME records only/mostly. Here are some from the beginning

      staging.pannekoeken-poffertjes-restaurant-amstelland.nl,CNAME,www.pannekoeken-poffertjes-restaurant-amstelland.nl.
      staging.pannekoeken-poffertjes-restaurant-amstelland.nl,CNAME,www.pannekoeken-poffertjes-restaurant-amstelland.nl.
      www.domiciliatuempresa.com,CNAME,domiciliatuempresa.com.
      www.domiciliatuempresa.com,CNAME,domiciliatuempresa.com.
      *.autokozmetikakaposvar.hu,CNAME,autokozmetikakaposvar.hu.
      *.autokozmetikakaposvar.hu,CNAME,autokozmetikakaposvar.hu.
      c7ac691a.oob-nuq1907.indubitably.xyz,CNAME,oob-nuq1907.hosts.secretcdn.net.
      c7ac691a.oob-nuq1907.indubitably.xyz,CNAME,oob-nuq1907.hosts.secretcdn.net.
    
    etc
    • Eikon a year ago

      It’s because I don’t try to de duplicate and just saves whatever response I get, which translates to this obvious behavior for cnames. Shouldn’t be a big deal.

      I may improve that in future releases.

  • blex a year ago

    Is there a good tool to browse big text archives, like .csv.xz, .csv.gz, or .7z, without decompressing them?

    I don't want to decompress 29 GB into 211 GB each time I want to make a search.

    Except grep / zgrep, is there a good tool/viewer (or hex editor that can decompress parts of big files for display) for this general task?

  • ciclista a year ago

    Would love the option of torrenting the file, download seems quite slow, and hopefully it would save you some bandwidth!

    • Eikon a year ago

      I was thinking about that, I’ll experiment by adding a .torrent file :)

  • g48ywsJk6w48 a year ago

    Thank you for data set!!! It is not always lowercase, so it have some duplicates.

    Also you can avoid unnecessary data with analyze CNAME records. -- domain.tld CNAME www.domain.tld -- So you can use only domain.tld or www.domain.tld records.

  • m3047 a year ago

    I've worked in the industry at IID and Farsight. I am skeptical of many claims made by IoC vendors.

    You need timestamps, or first / last seen.

    Records don't exist in a vacuum. They come in RRsets. They are served (sometimes inconsistently) by different nameservers. Some use cases care about this.

    Records which don't resolve are also useful, especially for use cases which amount to front-running. On any given day if the wind was blowing the right direction .belkin could be one of the top 10 non-resolving TLDs. If your data is any good, check under .cisco for stuff which resolves to 127.0.53.53. ;-)

    Information about provenance (where the data comes from) is required for some use cases.

    We shipped Farsight's DNSDB on one or more 1TB drives, depending on what the customer was purchasing.

  • whalesalad a year ago

    211GB seems very small. How is this generated?

    • Eikon a year ago

      What makes you think it's small?

  • mobilio a year ago

    note - that records can be geolocation routing.

    This mean that from country A i can get records as X, but in country B records can be Y.

    Would be great if you can make new column in CSV that can show about variations - Y/N.

  • 35mm a year ago

    How often is it updated?

    Does it include expired domains?

    • Eikon a year ago

      > How often is it updated?

      I plan to do 2 releases a month for now, goal is one a day.

      > Does it include expired domains?

      Yes.

      • mh- a year ago

        This is fantastically valuable, especially if you can add the first/last-seen as requested by another commenter. Thanks for doing this.

        • Eikon a year ago

          Thanks.

          That's quite a fun project!

  • romperstomper a year ago

    How many domains in this dataset?

  • nhggfu a year ago

    great work OP.

    • Eikon a year ago

      Thank you!

  • T3RMINATED a year ago

    Where do you get the data from? Does it include subdomains?