digitalmars.D - Buffer size policy when using std.experimental.allocator.
- QAston (28/28) Jan 28 2016 Hi.
- Rikki Cattermole (2/29) Jan 28 2016 Do one at a time but support reserving many more is the way I'd go.
Hi. I have the following code:

/++
A queue-like buffer that's first filled up with insertBack, then
depopulated using removeFront() until it's empty.
+/
struct TransducerBuffer(T, Allocator)
{
    private Allocator _allocator;
    private T[] _array;

    void insertBack(auto ref T newElem)
    {
        // the interesting part
    }

    T removeFront() { }
    size_t length() { }
    void clear() { }
}

The buffer will allocate elements one by one (the total amount is not
known beforehand). My question is: should I just call
expandArray(_allocator, _array, 1) for each newElem and count on the
allocator to provide a sensible allocation size policy, or should I
implement the policy myself, for example by requesting
expandArray(_allocator, _array, 2 * lastAllocationSize) whenever there is
no room left for newElem?
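To make the second option concrete, here is a minimal sketch of a
hand-rolled doubling policy on top of makeArray/expandArray from
std.experimental.allocator; the _length field, the initial capacity of 4,
and the factor of 2 are illustrative choices, not part of the code above:

// Minimal sketch of a do-it-yourself doubling policy; _length, the
// initial capacity and the factor of 2 are illustrative only.
struct TransducerBuffer(T, Allocator)
{
    import std.experimental.allocator : makeArray, expandArray;

    private Allocator _allocator;
    private T[] _array;      // allocated capacity
    private size_t _length;  // number of elements actually in use

    void insertBack()(auto ref T newElem)
    {
        if (_length == _array.length)
        {
            // Grow geometrically so repeated insertBack calls cause
            // O(log n) reallocations rather than one per element.
            immutable newCapacity = _array.length ? _array.length * 2 : 4;
            if (_array is null)
                _array = _allocator.makeArray!T(newCapacity);
            else
                _allocator.expandArray(_array, newCapacity - _array.length);
        }
        _array[_length++] = newElem;
    }
}

The point of tracking _length separately is that _array.length then
describes capacity, so not every insertBack has to touch the allocator.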
Jan 28 2016
On 29/01/16 3:58 AM, QAston wrote:
> [...]

Doing one at a time, but supporting reserving many more, is the way I'd go.
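For what that shape could look like in the buffer above: a rough sketch,
again assuming makeArray/expandArray from std.experimental.allocator; the
reserve method, the _length bookkeeping, and the omitted error handling
are illustrative only:

// Rough sketch of "one at a time, plus an explicit reserve"; names and
// details are illustrative, error handling omitted.
struct TransducerBuffer(T, Allocator)
{
    import std.experimental.allocator : makeArray, expandArray;

    private Allocator _allocator;
    private T[] _array;      // allocated capacity
    private size_t _length;  // number of elements in use

    // Callers that do know (or can estimate) the element count
    // pre-allocate once instead of growing per element.
    void reserve(size_t n)
    {
        if (n <= _array.length)
            return;
        if (_array is null)
            _array = _allocator.makeArray!T(n);
        else
            _allocator.expandArray(_array, n - _array.length);
    }

    // Otherwise grow by a single element and leave the sizing policy
    // to the allocator (or to an earlier reserve call).
    void insertBack()(auto ref T newElem)
    {
        if (_length == _array.length)
        {
            if (_array is null)
                _array = _allocator.makeArray!T(1);
            else
                _allocator.expandArray(_array, 1);
        }
        _array[_length++] = newElem;
    }
}

A caller that can estimate the element count would then do buf.reserve(n)
once and let insertBack fill the buffer without per-element reallocation;
callers that can't still work, just paying whatever growth policy the
underlying allocator happens to apply to expandArray(_array, 1).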
Jan 28 2016