
memoize with lifetime? #20

Open
hmeyer opened this issue Dec 30, 2022 · 2 comments
hmeyer commented Dec 30, 2022

Hey Lewin!
When writing a dynamic programming solution for Advent of Code I was really happy to find memoize!
But sadly it does not (yet) support this use case.
One reason is that, due to the current implementation, references are not supported.

I don't have a good solution yet, but I wonder: what would it take to change memoize from a global cache to a cache with a non-'static lifetime?

Cheers, Henning

@dermesser (Owner)

Hi Henning! Similarly to #18 or #19, this appears to be a question of whether memoize should be a silver-bullet solution for all caching.

At the moment the goal is: "Decorate a function with this one directive, and it will just work*", where *) means: as long as your function has a simple signature and returns fairly plain objects, so that we don't have to care about lifetimes, references, etc.

Of course this is not enough for a generalized caching framework. That in turn would (I argue) torpedo the small and straightforward feature set currently realized here. Given that using hashmaps is not very complex, in such a situation it seems easier for a developer to implement a small cache themselves.

With respect to a non-static cache for memoize, one problem is that - at the moment - the memoize macro is used at the top level around function definitions. As far as I know, it is therefore not really possible to avoid a 'static, global cache.
If what you need is a non-static, non-global cache, we come back to what I said above, and the simplest solution would be something along the lines of:

use std::collections::HashMap;

#[derive(PartialEq, Eq, Hash)]
struct ComplexType(i32, String);
struct MySimpleCache<'a>(HashMap<&'a ComplexType, String>);

impl<'a> MySimpleCache<'a> {
  fn business_logic(&mut self, arg: &'a ComplexType) -> &str {
    // Look up in own cache and/or recalculate (placeholder computation here).
    self.0
      .entry(arg)
      .or_insert_with(|| format!("{}-{}", arg.0, arg.1))
  }
}

which gives you ultimate control and is not that complex either.
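
For illustration, a call site for such a hand-rolled cache could look like this (the names and the computation are made up for the example):

fn main() {
  let key = ComplexType(42, "advent".to_string());
  let mut cache = MySimpleCache(HashMap::new());
  // The first call computes and stores the value; the second hits the cache.
  println!("{}", cache.business_logic(&key));
  println!("{}", cache.business_logic(&key));
}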

I hope I grasped your issue correctly, and please let me know if you have specific ideas that I don't see yet! :-)

All the best,
Lewin.


hmeyer commented Dec 30, 2022

I have a vague idea, at least for the dynamic-programming use case.
Often these problems come in a recursive form like so:

fn recursive_fn(args...) -> Foo {
  ...
  ...
  let partial_foo = recursive_fn(derived_args...);
  ...
  return result;
}

I wonder if one could write a function attribute which transforms this into:

fn recursive_fn(args...) -> Foo {
  return memoized_recursive_fn(&mut HashMap::new(), args...);
}

fn memoized_recursive_fn(cache: &mut HashMap<Args, Foo>, args...) -> Foo {
  if let Some(result) = cache.get(&args) {
    return result.clone();
  }
  ...
  ...
  let partial_foo = memoized_recursive_fn(cache, derived_args...);
  ...
  cache.insert(args, result.clone());
  return result;
}
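
For concreteness, here is a hand-written version of that transformation for a toy function (fib is just an illustrative example, not something the crate emits today):

use std::collections::HashMap;

// Public entry point keeps the original signature and owns the cache.
fn fib(n: u64) -> u64 {
  fib_memoized(&mut HashMap::new(), n)
}

// The cache is threaded through every recursive call instead of living in a
// 'static global, so it is dropped as soon as the outermost call returns.
fn fib_memoized(cache: &mut HashMap<u64, u64>, n: u64) -> u64 {
  if let Some(&result) = cache.get(&n) {
    return result;
  }
  let result = if n < 2 { n } else {
    fib_memoized(cache, n - 1) + fib_memoized(cache, n - 2)
  };
  cache.insert(n, result);
  result
}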
