Language agnostic - Marshalling a 32-bit int to a 16-bit int machine
I want to implement, and understand the concept of, marshalling in my own RPC mechanism (a toy, really). While I get the idea behind endianness, I am not sure how to handle 32-bit and 16-bit ints. The problem: a machine whose int is represented as 32 bits wants to invoke the function int foo(int x) via an RPC call, but on the server, int is represented as 16 bits. Sending only the lower 16 bits would lose information, which is not desirable.
I know IDLs exist to solve this problem. In this case, let's say the IDL "defines" int as 32 bits. While that works for this scenario, on the machine with a 16-bit int, 2 bytes are wasted on every transmission over the network.
If I flip the IDL to 16 bits instead, the user has to manually split their local int and do something fancy with it, breaking the transparency of RPC.
So what is the right way, as used in actual implementations?

Thanks.
Usually, IDLs define several platform-independent types (uint8, int8, uint16, int16, uint32, int32, uint64, int64) and a few platform-dependent ones, such as int and uint. The platform-dependent types have limited uses, such as the size/index of arrays; it is recommended to use the platform-independent types for everything else.
If a parameter is declared in the IDL as int32, it must be int32 on every platform. If it's declared as int, it depends on the platform.
For example, look at COM's VARENUM and VARIANT: you can see there are platform-independent types (such as unsigned short (VT_UI2), unsigned long (VT_UI4), unsigned long long (VT_UI8)) and machine types (such as int (VT_INT)).