
Scala Akka logging pattern cannot be changed


I am building a small actor application for my bachelor thesis, and now I am trying to add some logging.

For the logging inside the actor system I want to use the logback-classic logging that akka provides under the hood. So far the logging works, but when I change the pattern in logback.xml, the pattern inside the actor system does not change.

Does anyone know how to change the pattern globally, for slf4j and for the logging inside the actor system?

Log output outside the actor system:

2014-07-11 13:03:09 INFO  model.AccessLayer : New Object 10 with id:188586187441591506
2014-07-11 13:03:09 INFO  model.AccessLayer : New Object 11 with id:188586187442115794
Log output inside the actor system:

[INFO] [07/11/2014 13:03:09.199] [EBTreeSimulation-akka.actor.default-dispatcher-4] [akka://EBTreeSimulation/user/nodeA] nodeA[InsertNewObject] received new object:188586187441591506, 188586187441591506, 10
[INFO] [07/11/2014 13:03:09.199] [EBTreeSimulation-akka.actor.default-dispatcher-3] [akka://EBTreeSimulation/user/nodeB] nodeB[InsertNewObject] received new object:188586187441591506, 188586187441591506, 10
Logger initialization for slf4j:

import org.slf4j.LoggerFactory

class AccessLayer[T](communicationLayer:ActorRef, actors:List[ActorRef]) {
  val log = LoggerFactory.getLogger(classOf[AccessLayer[T]])
  ....
}
Logger initialization of the ActorSystem:

import akka.event.Logging
import com.typesafe.config.ConfigFactory

object SimulationMaster extends App{
   val system = ActorSystem("EBTreeSimulation", ConfigFactory.load.getConfig("akka"))
   val log = Logging.getLogger(system,this)
   ....
}

import akka.event.Logging

class TreeActor[T](communication:ActorRef) extends Actor {
   val log = Logging(context.system,this)
   ....

}
logback.xml:

<configuration>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <File>./logs/myLog.log</File>
    <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
        <fileNamePattern>logs/myLog.%i.log.zip</fileNamePattern>
        <minIndex>1</minIndex>
        <maxIndex>3</maxIndex>
    </rollingPolicy>
    <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
        <maxFileSize>5MB</maxFileSize>
    </triggeringPolicy>
    <encoder>
        <pattern>%date{YYYY-MM-dd HH:mm:ss} %level %X{sourceThread} %logger{10} [%file:%line]: %msg%n</pattern>
    </encoder>
</appender>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!--<target>System.out</target>-->
    <encoder>
        <pattern>%date{YYYY-MM-dd HH:mm:ss} %-5level %logger : %msg%n</pattern>
    </encoder>
</appender>
<!--<logger name="akka" level="INFO" />-->
<root level="info">
    <appender-ref ref="FILE"/>
    <appender-ref ref="STDOUT" />
</root>
</configuration>

application.conf (akka section):
akka {
    # Loggers to register at boot time (akka.event.Logging$DefaultLogger logs
    # to STDOUT)
    loggers = ["akka.event.slf4j.Slf4jLogger"]
    #event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]

    loglevel = "INFO"   
    stdout-loglevel = "DEBUG"
    actor {
       provider = "akka.cluster.ClusterActorRefProvider"
        default-dispatcher {
            throughput = 10
        }
    }
    remote {
        netty.tcp.port = 4711
    }
}

The problem is that akka loads the default configuration instead of the custom one. To fix this, change how the configuration is loaded:

import akka.actor.ActorSystem
import akka.event.Logging
import com.typesafe.config.ConfigFactory

object SimulationMaster extends App{
   val system = ActorSystem("EBTreeSimulation", ConfigFactory.load)
   val log = Logging.getLogger(system,this)
   ....
}

Thanks to cmbaxter.
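To see why getConfig("akka") breaks the logger setup, here is a minimal sketch using the Typesafe Config library directly (the object name and inline config string are illustrative, not from the original post):

```scala
import com.typesafe.config.ConfigFactory

object ConfigScopeDemo {
  // Illustrative config in the same shape as the application.conf above.
  val root = ConfigFactory.parseString(
    """akka {
      |  loggers = ["akka.event.slf4j.Slf4jLogger"]
      |  loglevel = "INFO"
      |}""".stripMargin)

  // The ActorSystem looks its settings up under the "akka" prefix,
  // so the full "akka.loggers" path must exist in the config it receives.
  val hasLoggersAtRoot = root.hasPath("akka.loggers")   // true

  // getConfig("akka") strips the prefix: the "akka.loggers" path is gone,
  // and the ActorSystem falls back to its default stdout logger.
  val scoped = root.getConfig("akka")
  val hasLoggersScoped = scoped.hasPath("akka.loggers") // false
}
```

This is why passing the unscoped ConfigFactory.load fixes the pattern: the Slf4jLogger actually gets registered, so logback.xml controls the output format.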

The other answer effectively answers the question; however, I ran into a few other related issues.

To help debug this problem, the ActorSystem configuration can be logged like this:

import akka.actor.ActorSystem
import akka.event.Logging
import com.typesafe.config.ConfigFactory

object SimulationMaster extends App{
   val conf = ConfigFactory.load
   val system = ActorSystem("EBTreeSimulation", conf)
   val log = Logging.getLogger(system,this)
   log info conf.getObject("akka").toString
   ....
}

In this output I saw that akka.loggers was still set to a default logger, not the slf4j logger. Looking further through the configuration, I found that the slf4j logger was set, but under the singular key akka.logger. After I made it plural, everything started working.
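A quick guard against this kind of misspelling is to check the loaded config before creating the ActorSystem. A sketch (LoggerCheck and slf4jEnabled are illustrative names, not from the original post):

```scala
import com.typesafe.config.Config

object LoggerCheck {
  // Akka only reads the plural "akka.loggers" at ActorSystem startup;
  // a singular "akka.logger" key is silently ignored.
  def slf4jEnabled(conf: Config): Boolean =
    conf.hasPath("akka.loggers") &&
      conf.getStringList("akka.loggers").contains("akka.event.slf4j.Slf4jLogger")
}
```

Calling require(LoggerCheck.slf4jEnabled(conf), "Slf4jLogger not registered") right after ConfigFactory.load turns the silent fallback into a loud startup failure.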

I think your problem may be related to the line ConfigFactory.load.getConfig("akka"). Try changing it to just ConfigFactory.load, so that it can see the child element "akka" and pick up the modified logging configuration from it.
Great, that solved the problem! Thanks!