Constrained MDPs and the reward hypothesis
It's been a long time since I last posted on this blog, but that doesn't mean the blog is dead. Slow and steady wins the race, right? Anyhow, I am back, and today I want to write about constrained Markov Decision Processes (CMDPs). The post is prompted by a recent visit by Eugene Feinberg, a pioneer of CMDPs, to our department, and also by a growing interest in CMDPs in the RL community (see this, this, or this paper). For impatient readers: a CMDP is like an MDP except that there are multiple reward functions, one of which is used to set the optimization objective, while the others are used to restrict what policies can do. Now, it seems to me that more often than not the problems we want to solve are easiest to specify using multiple objectives (in fact, this is a borderline tautology!). An example, which given our current sad situation is hard to escape, is deciding what interventions a government should apply to limit the spread of a virus while maintaining economic ...
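To make the "multiple reward functions" idea concrete, here is a minimal sketch of the objects involved. All names here (`CMDP`, `feasible`, and so on) are hypothetical illustrations, not anything from the literature: one reward function defines the objective, and each extra cost function comes with a budget that constrains admissible behavior.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class CMDP:
    """A toy finite constrained MDP: an MDP plus constraint costs and budgets."""
    states: List[int]
    actions: List[int]
    transition: Dict[Tuple[int, int], Dict[int, float]]  # P(s' | s, a)
    reward: Callable[[int, int], float]       # objective reward r(s, a)
    costs: List[Callable[[int, int], float]]  # constraint costs c_i(s, a)
    budgets: List[float]                      # thresholds d_i

    def feasible(self, episode: List[Tuple[int, int]]) -> bool:
        """Check that a trajectory's cumulative cost stays within every budget."""
        for c, d in zip(self.costs, self.budgets):
            if sum(c(s, a) for s, a in episode) > d:
                return False
        return True

# A tiny example: action 1 incurs unit cost, and the total cost budget is 1.
cmdp = CMDP(
    states=[0, 1],
    actions=[0, 1],
    transition={},  # omitted here; feasibility only inspects the trajectory
    reward=lambda s, a: 1.0,
    costs=[lambda s, a: float(a)],
    budgets=[1.0],
)
```

With this setup, a trajectory that plays action 1 at most once is feasible, while one that plays it twice violates the budget. A planner would then maximize expected reward only over policies whose expected cumulative costs respect the budgets.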
Sorry guys, the fix does not work for some reason. It has worked first, but now it stopped working.
Hi,
I know very little about Mac OS, but this issue seemed interesting, and then I read your comment that it stopped working after the first try. The link you mentioned talks about one more setting for Thunderbird. It says:
"For Thunderbird get to the config editor via Thunderbird > Preferences… > General > Config Editor… and add the same Boolean preference"
You haven't mentioned this in your post. So I was just wondering if you have tried this trick too?
--
Girija
Hi Girija,
Yes, I looked at that.
In my Thunderbird (2.0.0.23, the latest) there is no "Config Editor…" accessible from the preferences.
- Csaba
Hi,
I have Thunderbird 2.0.0.22. I found this setting under the Advanced tab of Preferences. You might want to check that.
--
Girija
Girija,
Thanks, this is great: I have found this setting there, too. And what is even better is that it works again! Perfect! Wow, you made my day!
Thanks,
Csaba
PS: I will update my post later to reflect the current state of the art.
:)
Great to hear that! You are welcome!
--
Girija
Thanks a lot!
Three customers asked me to solve this!
Great tips!
Sebastien