
digitalmars.D.learn - Compile-Time Size Checking of Enum Members of std.bitmanip.bitfields

reply "Nordlöw" <per.nordlow gmail.com> writes:
The following code example

import std.stdio, std.bitmanip;

enum E2 { a, b, c, d, e }

enum bf = bitfields!(uint, "x", 6,   // 6 bits for x
                     E2, "e2", 2);   // 2 bits for e2; 6 + 2 == 8 fills one byte

struct A { mixin(bf); }

void main(string[] args)
{
     A obj;
     obj.x = 2;
     obj.e2 = E2.a;

     import core.exception: AssertError;
     try
     {
         obj.e2 = E2.e; // E2.e == 4 needs 3 bits, but the field is only 2 bits wide
         assert(false, "AssertError not thrown");
     }
     catch (AssertError e) { /* ok: the generated setter asserts in non-release builds */ }
}

shows how brilliantly generic D's std.bitmanip.bitfields is.

However, wouldn't it be better to detect the mismatches between 
enum bit-sizes and bitfield lengths at compile-time instead of at 
run-time?

This is possible because all size information is available at 
compile-time as template parameters to bitfields.
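
For instance, a wrapper could reject the mismatch before it ever runs. A minimal sketch (checkedBitfields and bitsFor are hypothetical names, not Phobos API):

import std.bitmanip : bitfields;

// Bits required to store n distinct values (n >= 1); usable in CTFE.
size_t bitsFor(ulong n)
{
    size_t bits = 1;
    while ((1UL << bits) < n)
        ++bits;
    return bits;
}

template checkedBitfields(Args...)
{
    // Fields arrive as (type, name, length) triples.
    static foreach (i; 0 .. Args.length / 3)
    {
        static if (is(Args[3 * i] == enum))
            static assert(Args[3 * i + 2] >=
                          bitsFor(Args[3 * i].max - Args[3 * i].min + 1),
                          "bitfield '" ~ Args[3 * i + 1] ~
                          "' is too narrow for " ~ Args[3 * i].stringof);
    }
    enum checkedBitfields = bitfields!Args;
}

// With E2 as defined above (five values need 3 bits):
// struct A { mixin(checkedBitfields!(uint, "x", 6, E2, "e2", 2)); } // rejected at compile-time
struct B { mixin(checkedBitfields!(uint, "x", 5, E2, "e2", 3)); }    // 5 + 3 == 8: compiles

The same check could of course live inside bitfields itself, which is what a patch would do.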
Jan 19 2015
parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Nordlöw:

 wouldn't it be better to detect the mismatches between enum 
 bit-sizes and bitfield lengths at compile-time instead of at 
 run-time?
File an enhancement and/or submit a Phobos patch.

Bye,
bearophile
Jan 19 2015
parent reply "Per Nordlöw" <per.nordlow gmail.com> writes:
On Monday, 19 January 2015 at 11:23:11 UTC, bearophile wrote:
 File an enhancement and/or submit a Phobos patch.
Ok, great. I'll try fixing this in a PR.

Further, I propose to enhance `bitfields` to automatically deduce bitfield lengths in the following way:

autoBitfields!(ubyte, "x", 3, // explicit length
               E, "e", -1,    // field length is deduced
               ubyte, "y", 3  // padding to whole bytes is deduced
               )

Typical deductions are

- enums: E.max - E.min + 1 (this requires offsetting logic in packing)
- bool: 1

Alternatively, I guess we could use a specific enum type to indicate that the field length should be deduced to the minimum required length. I guess we could also add a template overload of bitfields with a specific first enum parameter that triggers size deduction, instead of a separate autoBitfields. What do you think?
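
A rough sketch of just the deduction rules (deducedBits is a hypothetical name; a full autoBitfields would additionally walk the (type, name, length) triples, substitute the deduced length for -1, and pad the tail to whole bytes):

// Bits needed for n distinct values (n >= 1); usable in CTFE.
size_t bitsFor(ulong n) { size_t b = 1; while ((1UL << b) < n) ++b; return b; }

template deducedBits(T)
{
    static if (is(T == bool))
        enum deducedBits = 1;                      // bool: always one bit
    else static if (is(T == enum))
        // pack v - T.min, so T.max - T.min + 1 values must fit
        enum deducedBits = bitsFor(T.max - T.min + 1);
    else
        static assert(0, "no deduction rule for " ~ T.stringof);
}

enum E { a, b, c, d, e }

static assert(deducedBits!bool == 1);
static assert(deducedBits!E == 3); // five values need three bits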
Jan 19 2015
parent reply "Per Nordlöw" <per.nordlow gmail.com> writes:
On Monday, 19 January 2015 at 11:40:07 UTC, Per Nordlöw wrote:
 Typical deductions are
 - enums: E.max - E.min + 1 (this requires offsetting logic in packing)
I guess a trait for this, say

enum bitsizeOf(E) = bitsNeeded(E.max - E.min + 1);

is motivated as well, if it doesn't already exist... What's the name of `bitsNeeded` (integer binary logarithm) in Phobos?
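
Until the right Phobos name turns up, a hand-rolled CTFE version makes the trait concrete and shows why the E.min offset pays off (bitsizeOf and bitsFor are hypothetical names):

size_t bitsFor(ulong n) { size_t b = 1; while ((1UL << b) < n) ++b; return b; }

enum bitsizeOf(E) = bitsFor(E.max - E.min + 1);

enum Small { a = 100, b, c }         // three values, offset from zero
static assert(bitsizeOf!Small == 2); // offsetting makes 2 bits enough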
Jan 19 2015
parent "Per Nordlöw" <per.nordlow gmail.com> writes:
On Monday, 19 January 2015 at 11:49:29 UTC, Per Nordlöw wrote:
 What's the name of `bitsNeeded` (integer binary logarithm) in Phobos?
core.bitop.bsr

For details see:
http://forum.dlang.org/thread/okonqhnxzqlqtxijxsfg forum.dlang.org#post-kscrsodwmslgveptrxmx:40forum.dlang.org
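
bsr returns the zero-based index of the most significant set bit, i.e. the integer binary logarithm (undefined for 0, hence the guard below). Assuming bsr is usable in CTFE (current compilers support this; otherwise a hand-rolled loop works), the trait from the previous post could be written as:

import core.bitop : bsr;

// n > 1 values need bsr(n - 1) + 1 bits; guard the single-value case
enum size_t bitsizeOf(E) = E.max == E.min ? 1 : bsr(E.max - E.min) + 1;

enum E2 { a, b, c, d, e }
static assert(bitsizeOf!E2 == 3); // values 0..4: bsr(4) + 1 == 3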
Jan 19 2015