Implementing a Simple Java Crawler with HttpClient and Jsoup
Introduction to HttpClient
HttpClient is a subproject of Apache HttpComponents (it originated under Jakarta Commons). It is an efficient, up-to-date, feature-rich client-side toolkit for programming against the HTTP protocol, and it supports the latest protocol versions. Its main features include:
- (1) Implements all HTTP methods (GET, POST, PUT, HEAD, etc.)
- (2) Supports automatic redirects
- (3) Supports HTTPS
- (4) Supports proxy servers, and more
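For instance, the redirect handling and proxy support listed above are configured through RequestConfig. A minimal sketch follows; the proxy host and port here are made-up placeholders, not values from the original article:

```java
import org.apache.http.HttpHost;
import org.apache.http.client.config.RequestConfig;

public class RequestConfigSketch {
    public static void main(String[] args) {
        // Hypothetical local proxy, purely for illustration
        HttpHost proxy = new HttpHost("127.0.0.1", 8888);
        RequestConfig config = RequestConfig.custom()
                .setRedirectsEnabled(true) // follow 3xx redirects automatically
                .setProxy(proxy)           // route requests through the proxy
                .setConnectTimeout(30000)  // connection timeout in ms
                .build();
        System.out.println(config.isRedirectsEnabled()); // true
        System.out.println(config.getConnectTimeout());  // 30000
    }
}
```

A config built this way can be attached per request via HttpGet#setConfig, as the test code below does for timeouts.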
Introduction to Jsoup
jsoup is a Java HTML parser that can parse HTML directly from a URL or from HTML text. It provides a very convenient API for extracting and manipulating data via DOM traversal, CSS selectors, and jQuery-like methods. Its main features include:
- (1) Parse HTML from a URL, a file, or a string;
- (2) Find and extract data using DOM traversal or CSS selectors;
- (3) Manipulate HTML elements, attributes, and text;
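As a quick illustration of points (1)–(3), the snippet below parses HTML from an in-memory string (no network needed) and pulls a link out with a CSS selector. The HTML fragment is invented for this example:

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class JsoupStringParseSketch {
    public static void main(String[] args) {
        // (1) Parse HTML from a string
        String html = "<div class=\"news\"><a href=\"/a.html\">Headline</a></div>";
        Document doc = Jsoup.parse(html);
        // (2) Locate the anchor with a CSS selector
        Element link = doc.select("div.news a").first();
        // (3) Read its attribute and text
        System.out.println(link.attr("href")); // prints "/a.html"
        System.out.println(link.text());       // prints "Headline"
    }
}
```

The same select/attr/text calls are what the crawler below uses on the fetched page.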
Usage Steps
Add Maven dependencies
Add the following dependencies to pom.xml:
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>
<dependency>
    <groupId>org.jsoup</groupId>
    <artifactId>jsoup</artifactId>
    <version>1.8.3</version>
</dependency>
Write a JUnit test
Code
import org.apache.http.HttpEntity;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.protocol.HttpClientContext;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.util.EntityUtils;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.junit.Test;
import java.util.List;
/**
 * HttpClient & Jsoup library test class
*
* Created by xuyh at 2017/11/6 15:28.
*/
public class HttpClientJsoupTest {
    @Test
    public void test() {
        // Fetch the page with HttpClient and read the response body as plain text
        HttpGet httpGet = new HttpGet("http://sports.sina.com.cn/");
        httpGet.setConfig(RequestConfig.custom().setSocketTimeout(30000).setConnectTimeout(30000).build());
        CloseableHttpClient httpClient = null;
        CloseableHttpResponse response = null;
        String responseStr = "";
        try {
            httpClient = HttpClientBuilder.create().build();
            HttpClientContext context = HttpClientContext.create();
            response = httpClient.execute(httpGet, context);
            int state = response.getStatusLine().getStatusCode();
            // Only read the body on a successful response
            if (state == 200) {
                HttpEntity entity = response.getEntity();
                if (entity != null)
                    responseStr = EntityUtils.toString(entity, "utf-8");
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                if (response != null)
                    response.close();
                if (httpClient != null)
                    httpClient.close();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
        if (responseStr.isEmpty())
            return;
        // Parse the plain text into a Jsoup Document and operate on it
        Document document = Jsoup.parse(responseStr);
        Element container = document.getElementsByAttributeValue("class", "phdnews_txt fr").first();
        if (container == null)
            return;
        List<Element> elements = container.getElementsByAttributeValue("class", "phdnews_hdline");
        elements.forEach(element -> {
            for (Element e : element.getElementsByTag("a")) {
                System.out.println(e.attr("href"));
                System.out.println(e.text());
            }
        });
    }
}
Detailed Explanation
- Create an HttpGet object that issues a GET request to http://sports.sina.com.cn/, and set both the socket timeout and the connection timeout to 30000 ms.
HttpGet httpGet = new HttpGet("http://sports.sina.com.cn/");
httpGet.setConfig(RequestConfig.custom().setSocketTimeout(30000).setConnectTimeout(30000).build());
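As a side note (not part of the original example), the same timeouts can instead be applied once for every request by installing a default RequestConfig on the client builder, a sketch of which is:

```java
import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;

public class DefaultConfigSketch {
    public static void main(String[] args) throws Exception {
        RequestConfig defaults = RequestConfig.custom()
                .setSocketTimeout(30000)
                .setConnectTimeout(30000)
                .build();
        // Every request executed by this client inherits these defaults,
        // so individual HttpGet objects no longer need setConfig
        CloseableHttpClient client = HttpClientBuilder.create()
                .setDefaultRequestConfig(defaults)
                .build();
        client.close();
    }
}
```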
- Build a CloseableHttpClient with HttpClientBuilder and execute the HttpGet request defined above, passing a freshly created HttpClientContext that carries per-request execution state. Then read the response entity as text.
CloseableHttpClient httpClient = null;
CloseableHttpResponse response = null;
String responseStr = "";
try {
httpClient = HttpClientBuilder.create().build();
HttpClientContext context = HttpClientContext.create();
response = httpClient.execute(httpGet, context);