Java: file operations with Akka actors
What advantage does using Akka actors have over plain file operations? I tried to measure the time needed to analyze a log file. The task is to find the IP addresses that logged in more than 50 times and display them. Plain file operations turned out to be faster than the Akka actor model. Why is that?

Using plain file operations:
public static void main(String[] args) {
    // long startTime = System.currentTimeMillis();
    File file = new File("log.txt");
    Map<String, Long> ipMap = new HashMap<>();
    try {
        FileReader fr = new FileReader(file);
        BufferedReader br = new BufferedReader(fr);
        String line = br.readLine();
        while (line != null) {
            int idx = line.indexOf('-');
            String ipAddress = line.substring(0, idx).trim();
            long count = ipMap.getOrDefault(ipAddress, 0L);
            ipMap.put(ipAddress, ++count);
            line = br.readLine();
        }
        br.close();
        System.out.println("================================");
        System.out.println("||\tCount\t||\t\tIP");
        System.out.println("================================");
        // LinkedHashMap keeps the sorted order (a plain HashMap would lose it)
        Map<String, Long> result = new LinkedHashMap<>();
        // Sort by value and put it into the "result" map
        ipMap.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .forEachOrdered(x -> result.put(x.getKey(), x.getValue()));
        // Print only if count > 50
        result.entrySet().stream().filter(entry -> entry.getValue() > 50).forEach(entry ->
                System.out.println("||\t" + entry.getValue() + " \t||\t" + entry.getKey())
        );
        // long endTime = System.currentTimeMillis();
        // System.out.println("Time: " + (endTime - startTime));
    } catch (IOException e) {
        e.printStackTrace();
    }
}
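For comparison, the same sequential count can be written more compactly with `java.nio.file` streams. A minimal sketch, assuming the same `IP - rest` line format the parsing above relies on (`StreamLogCount` and `countIps` are names introduced here for illustration):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class StreamLogCount {

    // Count occurrences per IP; a line is assumed to look like "1.2.3.4 - ...".
    static Map<String, Long> countIps(Stream<String> lines) {
        return lines
                .map(line -> line.substring(0, line.indexOf('-')).trim())
                .collect(Collectors.groupingBy(ip -> ip, Collectors.counting()));
    }

    public static void main(String[] args) throws IOException {
        Path p = Path.of("log.txt");
        if (Files.exists(p)) {
            try (Stream<String> lines = Files.lines(p)) {
                countIps(lines).entrySet().stream()
                        .filter(e -> e.getValue() > 50)
                        .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                        .forEach(e -> System.out.println(e.getValue() + "\t" + e.getKey()));
            }
        }
    }
}
```

The `groupingBy`/`counting` collector replaces the manual `getOrDefault`/`put` pair, and try-with-resources replaces the explicit `close()` calls.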
Using Actors:
1. The Main Class
public static void main(String[] args) {
    long startTime = System.currentTimeMillis();
    // Create the actor system
    ActorSystem akkaSystem = ActorSystem.create("akkaSystem");
    // Create the coordinating actor based on the specified class
    ActorRef coordinator = akkaSystem.actorOf(Props.create(FileAnalysisActor.class));
    // Create a message carrying the file path
    FileAnalysisMessage msg = new FileAnalysisMessage("log.txt");
    // Ask the actor to process the file; 'ask' returns a Future bounded by the timeout
    Timeout timeout = new Timeout(6, TimeUnit.SECONDS);
    Future<Object> future = Patterns.ask(coordinator, msg, timeout);
    // Process the result on the system's dispatcher
    final ExecutionContext ec = akkaSystem.dispatcher();
    future.onSuccess(new OnSuccess<Object>() {
        @Override
        public void onSuccess(Object message) throws Throwable {
            if (message instanceof FileProcessedMessage) {
                printResults((FileProcessedMessage) message);
                // Stop the actor system
                akkaSystem.shutdown();
            }
        }

        private void printResults(FileProcessedMessage message) {
            System.out.println("================================");
            System.out.println("||\tCount\t||\t\tIP");
            System.out.println("================================");
            Map<String, Long> result = new LinkedHashMap<>();
            // Sort by value and put it into the "result" map
            message.getData().entrySet().stream()
                    .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                    .forEachOrdered(x -> result.put(x.getKey(), x.getValue()));
            // Print only if count > 50
            result.entrySet().stream().filter(entry -> entry.getValue() > 50).forEach(entry ->
                    System.out.println("||\t" + entry.getValue() + " \t||\t" + entry.getKey())
            );
            long endTime = System.currentTimeMillis();
            System.out.println("Total time: " + (endTime - startTime));
        }
    }, ec);
}
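The ask-then-callback shape above (send a request, get a future, attach a callback that runs on the dispatcher) is not Akka-specific. A minimal plain-JDK sketch of the same pattern with `CompletableFuture`, where `analyze` is a hypothetical stand-in for the work `FileAnalysisActor` would do:

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AskPatternSketch {

    // Hypothetical stand-in for the actor's file analysis.
    static Map<String, Long> analyze(String fileName) {
        return Map.of("127.0.0.1", 51L);
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor(); // plays the dispatcher's role
        CompletableFuture<Map<String, Long>> future =
                CompletableFuture.supplyAsync(() -> analyze("log.txt"), pool); // the "ask"
        CompletableFuture<Void> done = future.thenAccept(result ->            // the "onSuccess"
                result.forEach((ip, count) -> System.out.println(count + "\t" + ip)));
        done.join();     // wait for the callback to finish
        pool.shutdown(); // analogous to akkaSystem.shutdown()
    }
}
```

This makes it easier to see what the actor version adds: the same asynchronous hand-off, plus message envelopes and mailbox scheduling per message.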
2. The File Analysis Message
public class FileAnalysisMessage {
    private String fileName;

    public FileAnalysisMessage(String file) {
        this.fileName = file;
    }

    public String getFileName() {
        return fileName;
    }
}
3. The File Processed Message
public class FileProcessedMessage {
    private Map<String, Long> data;

    public FileProcessedMessage(Map<String, Long> data) {
        this.data = data;
    }

    public Map<String, Long> getData() {
        return data;
    }
}
4. The Log Line Message
public class LogLineMessage {
    private String data;

    public LogLineMessage(String data) {
        this.data = data;
    }

    public String getData() {
        return data;
    }
}
I am creating an actor for every line in the file.

With every concurrency framework there is always a trade-off between the amount of concurrency deployed and the complexity involved in each unit of concurrency. Akka is no exception. In the non-akka approach each line goes through a relatively simple sequence of steps: read the line, parse out the IP address, and update the count in the HashMap. The per-line actor approach adds work for every single line: wrapping it in a LogLineMessage, dispatching that message to an Actor, and wrapping and sending back a LineProcessingResult message.
Rather than have each Actor process a single line, let each Actor process N lines into its own sub-hashmap (e.g., 1000 lines per Actor). That way an Actor does not send back something as simple as a single IP address; instead, it sends a hash of counts for its subset of lines:
public class LineProcessingResult {
    private HashMap<String, Long> ipAddressCount;

    public LineProcessingResult(HashMap<String, Long> count) {
        this.ipAddressCount = count; // was "Count", which does not compile
    }

    public HashMap<String, Long> getIpAddressCount() {
        return ipAddressCount;
    }
}
This allows the file IO, the line processing, and the sub-map combining all to run in parallel.

You would need to show the code you wrote for this comparison. Thanks for your help.
For reference, the original per-line LineProcessingResult carried only a single IP address:
public class LineProcessingResult {
    private String ipAddress;

    public LineProcessingResult(String ipAddress) {
        this.ipAddress = ipAddress;
    }

    public String getIpAddress() {
        return ipAddress;
    }
}
Likewise, the per-line LogLineMessage carried a single line:
public class LogLineMessage {
    private String data;

    public LogLineMessage(String data) {
        this.data = data;
    }

    public String getData() {
        return data;
    }
}
With batching, LogLineMessage carries an array of lines instead:
public class LogLineMessage {
    private String[] data;

    public LogLineMessage(String[] data) {
        this.data = data;
    }

    public String[] getData() {
        return data;
    }
}
// inside of FileAnalysisActor
else if (message instanceof LineProcessingResult) {
    HashMap<String, Long> localCount = ((LineProcessingResult) message).getIpAddressCount();
    // Merge the worker's sub-map into the coordinator's global count
    // (the original used Scala-style "foreach" and dropped the semicolon)
    localCount.forEach((ipAddress, count) ->
        ipMap.put(ipAddress, ipMap.getOrDefault(ipAddress, 0L) + count)
    );
}
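The combine step in the handler above can also be written with `Map.merge`, which folds the "look up, add, put back" sequence into one call. A small stand-alone sketch (`SubMapMerge` and `mergeInto` are names introduced here for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class SubMapMerge {

    // Merge one worker's sub-counts into the global count, as the
    // LineProcessingResult handler does with getOrDefault + put.
    static void mergeInto(Map<String, Long> global, Map<String, Long> local) {
        local.forEach((ip, count) -> global.merge(ip, count, Long::sum));
    }

    public static void main(String[] args) {
        Map<String, Long> global = new HashMap<>(Map.of("1.1.1.1", 40L));
        mergeInto(global, Map.of("1.1.1.1", 15L, "2.2.2.2", 3L));
        System.out.println(global); // 1.1.1.1 maps to 55, 2.2.2.2 maps to 3
    }
}
```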
And the coordinator reads the file in batches of N lines, handing each full buffer to a fresh LogLineProcessor actor:
FileReader fr = new FileReader(file);
BufferedReader br = new BufferedReader(fr);
String[] lineBuffer = null;
int bufferCount = 0;
final int N = 1000;
String line = br.readLine();
while (line != null) {
    if (0 == bufferCount) {
        lineBuffer = new String[N];
    } else if (N == bufferCount) {
        // Buffer full: hand the batch to a new worker actor
        Props props = Props.create(LogLineProcessor.class);
        ActorRef lineProcessorActor = this.getContext().actorOf(props);
        lineProcessorActor.tell(new LogLineMessage(lineBuffer), this.getSelf());
        bufferCount = 0;
        continue; // re-enter the loop so the current line lands in a fresh buffer
    }
    lineBuffer[bufferCount] = line;
    line = br.readLine(); // the original dropped this assignment, looping forever
    bufferCount++;
}
// Handle the final, partially filled buffer (trim off the trailing nulls)
if (bufferCount > 0) {
    Props props = Props.create(LogLineProcessor.class);
    ActorRef lineProcessorActor = this.getContext().actorOf(props);
    lineProcessorActor.tell(new LogLineMessage(Arrays.copyOf(lineBuffer, bufferCount)),
            this.getSelf());
}
br.close();
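The whole batched design (chunk N lines, build a sub-map per chunk, merge the sub-maps) can be tried without Akka using an `ExecutorService`, which makes it easy to time against the sequential version. A sketch under the same assumed `IP - rest` line format (`BatchedCount` and its methods are names introduced here for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BatchedCount {

    // Count IPs in one chunk of lines; the role a LogLineProcessor actor plays.
    static Map<String, Long> countChunk(List<String> chunk) {
        Map<String, Long> sub = new HashMap<>();
        for (String line : chunk) {
            String ip = line.substring(0, line.indexOf('-')).trim();
            sub.merge(ip, 1L, Long::sum);
        }
        return sub;
    }

    // Split lines into chunks of n, count each chunk in parallel, merge the sub-maps.
    static Map<String, Long> countAll(List<String> lines, int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<Map<String, Long>>> futures = new ArrayList<>();
        for (int i = 0; i < lines.size(); i += n) {
            List<String> chunk = lines.subList(i, Math.min(i + n, lines.size()));
            futures.add(pool.submit(() -> countChunk(chunk)));
        }
        Map<String, Long> total = new HashMap<>();
        for (Future<Map<String, Long>> f : futures) {
            f.get().forEach((ip, c) -> total.merge(ip, c, Long::sum));
        }
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < 2500; i++) lines.add("10.0.0." + (i % 3) + " - login");
        System.out.println(countAll(lines, 1000)); // three IPs, roughly 833 each
    }
}
```

Even with this structure, for a single log file the sequential version may still win: the per-batch scheduling and map-merging overhead only pays off once parsing each chunk is expensive relative to reading it.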