I know this is not a perfect approach, but I would like to give a hint to anyone who wants to download all the links in an HTML page.

Example usage looks like this:

```csharp
// only download "jpg" files
HttpUtils.SaveFirstLevelLinksToFile("", Encoding.UTF8, "c:/temp/",
    link => ".jpg".Equals(Path.GetExtension(link.Link), StringComparison.OrdinalIgnoreCase));
```

The main code is below.

```csharp
// main entry point method
public static void SaveFirstLevelLinksToFile(string baseUri, Encoding enc, string dir, Func<LinkAttr, bool> filter)
{
    ProcessAllExtractedLinksInHtmlText(GetPage(baseUri, enc), link =>
    {
        try
        {
            // skip links the caller's filter rejects
            if (!filter(link)) return;

            // resolve the link against the base URI and build the local file path
            Uri uri = ConvertToAbsoluteURL(baseUri, link.Link);
            var filePath = dir + uri.AbsoluteUri.GetFileName().Replace("?", "");

            // download the link target and save it to disk
            uri.AbsoluteUri.GetAndSaveToFile(filePath);
            ...
```
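The helper methods and the `LinkAttr` type are not shown above, so here is a minimal sketch of how they might look, assuming `WebClient` for the downloads and a simple regex for link extraction. The names `GetPage`, `ProcessAllExtractedLinksInHtmlText`, `ConvertToAbsoluteURL`, `GetFileName`, and `GetAndSaveToFile` follow the main code, but the bodies below are my assumptions, not the original implementation.

```csharp
using System;
using System.Net;
using System.Text;
using System.Text.RegularExpressions;

// Hypothetical sketch of the helpers the main method relies on.
public class LinkAttr
{
    public string Link { get; set; }   // raw href/src value as found in the HTML
}

public static partial class HttpUtils
{
    // Download the page body as a string using the given encoding.
    public static string GetPage(string uri, Encoding enc)
    {
        using (var client = new WebClient())
        {
            client.Encoding = enc;
            return client.DownloadString(uri);
        }
    }

    // Extract every href/src attribute with a simple regex and hand each one to the callback.
    public static void ProcessAllExtractedLinksInHtmlText(string html, Action<LinkAttr> action)
    {
        var matches = Regex.Matches(html, @"(?:href|src)\s*=\s*[""']([^""']+)[""']", RegexOptions.IgnoreCase);
        foreach (Match m in matches)
        {
            action(new LinkAttr { Link = m.Groups[1].Value });
        }
    }

    // Resolve a (possibly relative) link against the page's base URI.
    public static Uri ConvertToAbsoluteURL(string baseUri, string link)
    {
        return new Uri(new Uri(baseUri), link);
    }

    // Take everything after the last '/' as the file name (query string included,
    // which is why the main code strips '?' afterwards).
    public static string GetFileName(this string uri)
    {
        return uri.Substring(uri.LastIndexOf('/') + 1);
    }

    // Download the resource behind the URL and write it to the given path.
    public static void GetAndSaveToFile(this string uri, string filePath)
    {
        using (var client = new WebClient())
        {
            client.DownloadFile(uri, filePath);
        }
    }
}
```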